Chapter 6: Function Calling and Tool Use
Practical Exercises - Chapter 6
Test your understanding and reinforce your skills by working through the following exercises. These tasks will help you set up function calls, define function schemas, chain external tool calls, and parse API responses.
Exercise 1: Basic Function Calling
Task:
Construct a conversation where the assistant recognizes a request that requires a calculation. Define a function called calculate_sum
that takes two numbers, and simulate an API call where the model triggers a function call to compute the sum of 8 and 15.
Solution:
import openai
import os
import json
from dotenv import load_dotenv

load_dotenv()
openai.api_key = os.getenv("OPENAI_API_KEY")

# Define the function schema for calculating the sum.
function_definitions = [
    {
        "name": "calculate_sum",
        "description": "Calculates the sum of two numbers.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "number", "description": "The first number."},
                "b": {"type": "number", "description": "The second number."}
            },
            "required": ["a", "b"]
        }
    }
]

# Construct a conversation to trigger the function call.
messages = [
    {"role": "system", "content": "You are a helpful assistant who can perform calculations."},
    {"role": "user", "content": "What is the sum of 8 and 15?"}
]

response = openai.ChatCompletion.create(
    model="gpt-4o",
    messages=messages,
    functions=function_definitions,
    function_call="auto",  # Let the model decide whether to call the function.
    max_tokens=100,
    temperature=0.5
)

if response["choices"][0].get("finish_reason") == "function_call":
    function_call_info = response["choices"][0]["message"]["function_call"]
    print("Function Call Triggered:")
    print("Function Name:", function_call_info.get("name"))
    print("Arguments:", function_call_info.get("arguments"))
else:
    print("Textual Response:")
    print(response["choices"][0]["message"]["content"])
This exercise demonstrates how to set up a function call, define a function schema, and check whether the model elects to call the function based on the conversation context.
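The model only proposes the call; your code still has to execute it and hand the result back. Below is a minimal sketch of that second half, using a simulated function_call payload in place of a live API response (the local calculate_sum implementation and the payload contents are our own assumptions, not something the API guarantees):

```python
import json

def calculate_sum(a, b):
    # Local implementation backing the calculate_sum schema.
    return a + b

# Simulated function_call payload, shaped the way the API returns it:
# the model supplies "arguments" as a JSON string, not a dict.
function_call_info = {"name": "calculate_sum", "arguments": '{"a": 8, "b": 15}'}

args = json.loads(function_call_info["arguments"])
result = calculate_sum(args["a"], args["b"])

# Feed the result back as a "function"-role message so the model can
# phrase the final answer on a follow-up ChatCompletion request.
function_result_message = {
    "role": "function",
    "name": function_call_info["name"],
    "content": json.dumps({"result": result}),
}
print(function_result_message["content"])  # {"result": 23}
```

Appending function_result_message to the messages list and making one more API call would let the model answer "8 plus 15 is 23" in natural language.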
Exercise 2: Defining a Function Schema for a Custom Operation
Task:
Define a new function schema for a function named calculate_difference
that computes the difference between two numbers. Then simulate a conversation where a user asks, "What is the difference between 20 and 5?" and handle the function call.
Solution:
# Define the function schema for calculating the difference.
function_definitions = [
    {
        "name": "calculate_difference",
        "description": "Calculates the difference between two numbers (first minus second).",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "number", "description": "The number from which to subtract."},
                "b": {"type": "number", "description": "The number to subtract."}
            },
            "required": ["a", "b"]
        }
    }
]

# Conversation setup
messages = [
    {"role": "system", "content": "You are a calculator assistant."},
    {"role": "user", "content": "Can you tell me the difference between 20 and 5?"}
]

response = openai.ChatCompletion.create(
    model="gpt-4o",
    messages=messages,
    functions=function_definitions,
    function_call="auto",
    max_tokens=100,
    temperature=0.5
)

if response["choices"][0].get("finish_reason") == "function_call":
    function_call_info = response["choices"][0]["message"]["function_call"]
    print("Function Call Triggered:")
    print("Function Name:", function_call_info.get("name"))
    print("Arguments:", function_call_info.get("arguments"))
else:
    print("Response:")
    print(response["choices"][0]["message"]["content"])
This exercise shows you how to create a custom function schema and set up a conversation that can trigger the function call.
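Models occasionally emit malformed or incomplete argument strings, so it is prudent to validate them against your schema before executing anything. Here is a lightweight sketch of such a check (validate_arguments is a hypothetical helper of our own, not an SDK feature, and it covers only the "number" case used in this chapter):

```python
import json

function_schema = {
    "name": "calculate_difference",
    "parameters": {
        "type": "object",
        "properties": {
            "a": {"type": "number"},
            "b": {"type": "number"},
        },
        "required": ["a", "b"],
    },
}

def validate_arguments(arguments_json, schema):
    # Check a function_call arguments string against the schema:
    # valid JSON, all required keys present, numeric fields numeric.
    # Returns the parsed dict on success, None on any failure.
    try:
        args = json.loads(arguments_json)
    except json.JSONDecodeError:
        return None
    params = schema["parameters"]
    for key in params["required"]:
        if key not in args:
            return None
    for key, spec in params["properties"].items():
        if spec["type"] == "number" and key in args:
            if isinstance(args[key], bool) or not isinstance(args[key], (int, float)):
                return None
    return args

args = validate_arguments('{"a": 20, "b": 5}', function_schema)
print(args["a"] - args["b"])  # 15
bad = validate_arguments('{"a": 20}', function_schema)  # missing "b" -> None
```

Only when validation succeeds would you go on to compute the difference and return it to the model.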
Exercise 3: Tool Use and API Chaining
Task:
Simulate a scenario where the user asks about the current weather in "Los Angeles". First, define a function get_weather
to simulate retrieving weather data. Then integrate this function into your conversation through API chaining.
Solution:
import json

def get_weather(city):
    # Simulated weather data for demonstration purposes.
    weather_data = {
        "Los Angeles": {"temperature": 26, "condition": "sunny"},
        "New York": {"temperature": 18, "condition": "cloudy"},
        "San Francisco": {"temperature": 15, "condition": "foggy"}
    }
    return weather_data.get(city, {"temperature": None, "condition": "unknown"})

# Define the function schema for weather retrieval.
function_definitions = [
    {
        "name": "get_weather",
        "description": "Fetches current weather data for a specified city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "The city to get weather information for."}
            },
            "required": ["city"]
        }
    }
]

# Construct the conversation.
messages = [
    {"role": "system", "content": "You are a knowledgeable assistant that can provide weather updates."},
    {"role": "user", "content": "What is the current weather in Los Angeles?"}
]

response = openai.ChatCompletion.create(
    model="gpt-4o",
    messages=messages,
    functions=function_definitions,
    function_call="auto",
    max_tokens=150,
    temperature=0.5
)

if response["choices"][0].get("finish_reason") == "function_call":
    function_call_info = response["choices"][0]["message"]["function_call"]
    # Parse the arguments, which arrive as a JSON string.
    arguments_str = function_call_info.get("arguments", "{}")
    arguments = json.loads(arguments_str)
    city = arguments.get("city", "Los Angeles")
    # Call the simulated external weather function.
    weather_info = get_weather(city)
    follow_up_message = (
        f"The weather in {city} is currently {weather_info['condition']} "
        f"with a temperature of {weather_info['temperature']}°C."
    )
    # Append the function result to the conversation.
    messages.append({
        "role": "assistant",
        "content": follow_up_message
    })
    # Optionally, generate a final summary response.
    final_response = openai.ChatCompletion.create(
        model="gpt-4o",
        messages=messages,
        max_tokens=100,
        temperature=0.5
    )
    print("Final Chained Response:")
    print(final_response["choices"][0]["message"]["content"])
else:
    print("Response:")
    print(response["choices"][0]["message"]["content"])
This exercise demonstrates how to chain an external function call into the conversation flow, retrieve data from a simulated weather service, and integrate that information back into the conversation.
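Once you expose more than one function to the model, a dispatch table keeps the routing logic in one place instead of a growing if/elif chain. Below is a sketch using the same simulated-payload approach as above (FUNCTION_REGISTRY and dispatch are names of our own invention):

```python
import json

def get_weather(city):
    # Trimmed copy of the simulated weather lookup from this exercise.
    data = {"Los Angeles": {"temperature": 26, "condition": "sunny"}}
    return data.get(city, {"temperature": None, "condition": "unknown"})

def calculate_sum(a, b):
    return a + b

# Registry mapping schema names to local implementations.
FUNCTION_REGISTRY = {
    "get_weather": get_weather,
    "calculate_sum": calculate_sum,
}

def dispatch(function_call_info):
    # Route a function_call payload to the matching local function.
    name = function_call_info["name"]
    fn = FUNCTION_REGISTRY.get(name)
    if fn is None:
        raise ValueError(f"Unknown function: {name}")
    args = json.loads(function_call_info.get("arguments", "{}"))
    return fn(**args)

weather = dispatch({"name": "get_weather", "arguments": '{"city": "Los Angeles"}'})
print(weather)  # {'temperature': 26, 'condition': 'sunny'}
```

Raising on an unknown name (rather than silently ignoring it) surfaces cases where the model hallucinates a function you never defined.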
Exercise 4: Parsing a Streaming Response
Task:
Implement a streaming API call that outputs the generated text as soon as it is produced. Use a simple query and display each streamed chunk.
Solution:
import openai
import os
from dotenv import load_dotenv

load_dotenv()
openai.api_key = os.getenv("OPENAI_API_KEY")

messages = [
    {"role": "system", "content": "You are a friendly assistant who provides inspirational quotes."},
    {"role": "user", "content": "Share an inspirational quote about perseverance."}
]

response_stream = openai.ChatCompletion.create(
    model="gpt-4o",
    messages=messages,
    max_tokens=100,
    temperature=0.6,
    stream=True  # Enable streaming mode.
)

print("Streaming Response:")
for chunk in response_stream:
    if "choices" in chunk:
        # The delta may omit "content" (role-only and final chunks do).
        part = chunk["choices"][0].get("delta", {}).get("content") or ""
        print(part, end="", flush=True)
print("\nStreaming complete!")
This exercise helps you get familiar with handling streaming responses, enabling your application to provide real-time feedback.
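If you also need the complete text after streaming finishes (for logging, or to append to the messages list), accumulate the deltas as they arrive. The sketch below uses hand-built chunks shaped like the streaming API's output, so it runs without a network call; the chunk contents are invented for illustration:

```python
def accumulate_stream(chunks):
    # Collect streamed delta fragments into the full response text.
    # Each chunk mirrors the streaming API shape: a "delta" dict that
    # may or may not carry "content" (role-only and final chunks don't).
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0].get("delta", {})
        content = delta.get("content")
        if content:
            parts.append(content)
    return "".join(parts)

simulated_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},          # opening chunk
    {"choices": [{"delta": {"content": "Fall seven times, "}}]},
    {"choices": [{"delta": {"content": "stand up eight."}}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},    # closing chunk
]
print(accumulate_stream(simulated_chunks))  # Fall seven times, stand up eight.
```

With a real stream, you would call accumulate_stream over the response_stream iterator (printing each part as you go) and then store the joined string as the assistant's message.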
These exercises cover the key aspects of function calling and tool use—from setting up function calls and defining function schemas, to chaining multiple API calls and handling real-time streaming responses. By completing these exercises, you’ll gain practical experience in integrating dynamic operations into your conversational AI applications.