Using ChatGPT Chat FunctionCall from Java
Introduction
As artificial intelligence and chatbots continue to gain popularity, integrating functions into chat conversations is becoming increasingly important. Functions are small units of code that can be reused and embedded into larger programs to perform a specific task.
This developer notebook guides you through implementing and integrating functions into ChatGPT conversations using JAI, a Java OpenAI API client.
The latest models (gpt-3.5-turbo-0613 and gpt-4-0613) have been fine-tuned to detect when a function should be called based on the input and to respond with JSON that follows the function signature. This power also comes with potential risks, so you should build a user-confirmation step before taking actions on a user's behalf that could impact the world (sending an email, posting political views online, buying something, and so on).
The notebook begins by explaining how to define a function by creating a dictionary (map) that maps function names to their respective implementations. The implementation can be a simple function with arguments or a more complex one, such as a call to a backend or external API.
The model will intelligently select a function and output a JSON object, called a functionCall, containing the name of the function to call and the arguments to pass it. Note that the Chat Completions API does not call the function; it generates JSON that you use to call the function in your own code.
Why it is important
This is a powerful concept that gives you a way to combine up-to-date data from your business domain with ChatGPT.
Function callbacks are used heavily by ChatGPT plugins. Function calling lets you add more context to ChatGPT's output and get structured data back from the model, as the sketch below illustrates.
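For example, to extract structured data from free text you can describe a function whose arguments are exactly the fields you want and let the model "call" it; your code then simply reads the arguments. This is a minimal sketch using the same JAI builders shown later in this notebook; the extract_contact_info function and its name/email fields are hypothetical, purely for illustration.
// Hypothetical function definition used only to get structured data back from the model.
// The model never executes it; it just fills in the arguments as JSON that your code reads.
private static FunctionDef getExtractContactInfoDefinition() {
    return FunctionDef.builder().name("extract_contact_info")
            .description("Extract the person's name and email address from the text")
            .setParameters(
                    ObjectParameter.objectParamBuilder()
                            .addParameter(Parameter.builder().name("name")
                                    .description("The person's full name").build())
                            .addParameter(Parameter.builder().name("email")
                                    .description("The person's email address").build())
                            .required("name")
                            .build()
            ).build();
}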
Basic steps
The basic steps for a function call are:
1. Call the ChatGPT chat with the user query and include the set of functions defined in the functions parameter.
2. ChatGPT can choose to call a function. It does so by returning a functionCall in the response message; from that functionCall you invoke the matching function in your own code with the arguments provided and collect its response.
3. Call ChatGPT again, appending the function response as a new message, and let the model summarize the results back to the user.
Code breakdown
Let’s look at some example code.
Here is the complete code listing of our example:
package com.cloudurable.jai.examples;
import com.cloudurable.jai.OpenAIClient;
import com.cloudurable.jai.model.text.completion.chat.ChatRequest;
import com.cloudurable.jai.model.text.completion.chat.ChatResponse;
import com.cloudurable.jai.model.text.completion.chat.Message;
import com.cloudurable.jai.model.text.completion.chat.Role;
import com.cloudurable.jai.model.text.completion.chat.function.*;
import com.cloudurable.jai.util.JsonSerializer;
import io.nats.jparse.node.ObjectNode;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;
public class WeatherFunctionCallExample {
private final OpenAIClient client;
/**
* Holds the function mappings.
*/
private final Map<String, Function<ObjectNode, String>> functionMap = new HashMap<>();
public WeatherFunctionCallExample(OpenAIClient client) {
this.client = client;
functionMap.put("get_current_weather", this::getCurrentWeather);
}
/**
 * Example dummy function hard coded to return the same weather for two cities
 * and a default for other cities. In production, this could be your backend API
 * or an external API.
 *
 * @param objectNode arguments from ChatGPT.
 * @return JSON string
 */
public String getCurrentWeather(final ObjectNode objectNode) {
final String location = objectNode.getString("location");
final String unit = Optional.ofNullable(objectNode.get("unit"))
.map(node->node.asScalar().stringValue()).orElse("fahrenheit");
final JsonSerializer json = new JsonSerializer();
json.startObject();
json.addAttribute("location", location);
json.addAttribute("unit", unit);
switch (location) {
case "Austin, TX":
json.addAttribute("temperature", 92);
json.endObject();
return json.toString();
case "Boston, MA":
json.addAttribute("temperature", 72);
json.endObject();
return json.toString();
default:
json.addAttribute("temperature", 70);
json.endObject();
return json.toString();
}
}
public static void main(final String... args) {
try {
final var client = OpenAIClient.builder().setApiKey(System.getenv("OPENAI_API_KEY")).build();
new WeatherFunctionCallExample(client).runConversation();
} catch (Exception ex) {
ex.printStackTrace();
}
}
public void runConversation() {
final var getCurrentWeatherFunc = getFunctionDefinition();
final var message = Message.builder().role(Role.USER)
.content("What's the weather like in Boston in fahrenheit?").build();
final var chatBuilder = ChatRequest.builder()
.model("gpt-3.5-turbo-0613")
.addMessage(message)
.addFunction(getCurrentWeatherFunc)
.functionalCall(ChatRequest.AUTO);
final var chatRequest = chatBuilder.build();
final var chatResponse = client.chat(chatRequest);
chatResponse.getResponse().ifPresent(
response -> handleFunctionCallback(chatBuilder, response));
System.out.println(chatResponse.getStatusCode().orElse(666));
System.out.println(chatResponse.getStatusMessage().orElse(""));
chatResponse.getException().ifPresent(e-> e.printStackTrace());
}
private void handleFunctionCallback(final ChatRequest.Builder chatBuilder, final ChatResponse chatResponse) {
var responseMessage = chatResponse.getChoices().get(0).getMessage();
var functionCall = responseMessage.getFunctionCall();
var functionResponse = getFunctionResponse(functionCall);
chatBuilder.addMessage(Message.builder().name(functionCall.getName()).role(Role.FUNCTION)
.content(functionResponse)
.build());
var response = client.chat(chatBuilder.build());
response.getResponse().ifPresent(chatResponse1 ->
System.out.println(chatResponse1.getChoices().get(0).getMessage().getContent()));
if (response.getStatusMessage().isPresent()) {
System.out.println(response.getStatusMessage().orElse(""));
}
}
private String getFunctionResponse(FunctionalCall functionCall) {
String functionResponse = "";
if (functionCall !=null && functionMap.containsKey(functionCall.getName())) {
functionResponse = functionMap.get(functionCall.getName()).apply(functionCall.getArguments());
}
return functionResponse;
}
private static FunctionDef getFunctionDefinition() {
return FunctionDef.builder().name("get_current_weather")
.description("Get the current weather in a given location")
.setParameters(
ObjectParameter.objectParamBuilder()
.addParameter(Parameter.builder().name("location")
.description("The city and state, e.g. Austin, TX").build())
.addParameter(
EnumParameter.enumBuilder()
.name("unit").enumValues("celsius", "fahrenheit").build())
.required("location")
.build()
).build();
}
}
If that is unclear, let me break it down based on our defined steps. Then we will show the messages flying across the wire between the client application and ChatGPT.
1.1 Call the ChatGPT chat with the user query
public void runConversation() {
// Call the ChatGPT chat with the user query.
final var message = Message.builder().role(Role.USER)
.content("What's the weather like in Boston in fahrenheit?").build();
final var chatBuilder = ChatRequest.builder()
.model("gpt-3.5-turbo-0613")
.addMessage(message) // user query
...
.functionalCall(ChatRequest.AUTO);
The runConversation() method establishes a conversation with the ChatGPT model: it builds a user message, configures the chat request, and prepares the request for function calls so the model can invoke specific functions as needed. Let's break it down step by step.
1.2 And include the set of functions defined in the functions parameter.
private static FunctionDef getFunctionDefinition() {
return FunctionDef.builder().name("get_current_weather")
.description("Get the current weather in a given location")
.setParameters(
ObjectParameter.objectParamBuilder()
.addParameter(Parameter.builder().name("location")
.description("The city and state, e.g. Austin, TX")
.build())
.addParameter(
EnumParameter.enumBuilder()
.name("unit")
.enumValues("celsius", "fahrenheit")
.build())
.required("location")
.build()
).build();
}
public void runConversation() {
...
// And include the set of functions defined in the functions parameter.
final var getCurrentWeatherFunc = getFunctionDefinition();// defined above
final var chatBuilder = ChatRequest.builder()
.model("gpt-3.5-turbo-0613")
.addMessage(message)
.addFunction(getCurrentWeatherFunc) //function definitions
.functionalCall(ChatRequest.AUTO);
The runConversation() method serves as an entry point for executing a conversation with the ChatGPT model.
1.2.1 getFunctionDefinition breakdown
private static FunctionDef getFunctionDefinition() {
return FunctionDef.builder().name("get_current_weather")
.description("Get the current weather in a given location")
.setParameters(
ObjectParameter.objectParamBuilder()
.addParameter(Parameter.builder().name("location")
.description("The city and state, e.g. Austin, TX")
.build())
.addParameter(
EnumParameter.enumBuilder()
.name("unit")
.enumValues("celsius", "fahrenheit")
.build())
.required("location")
.build()
).build();
}
The method getFunctionDefinition() returns a FunctionDef object: the function definition we pass to the ChatGPT conversation as part of the menu of functions the model can call.
Here's a breakdown of what the code does:
getFunctionDefinition() defines a function named "get_current_weather" that takes two parameters, "location" and "unit". It represents a function that retrieves the current weather for a given location in Celsius or Fahrenheit, and ChatGPT can select it to provide weather information based on user queries.
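If you want the model to choose between several functions, you add more FunctionDef entries to the request and register matching implementations in the functionMap. Here is a minimal sketch using the same JAI builders; the get_n_day_forecast function, its num_days parameter, and the getForecast handler are hypothetical, and it assumes addFunction() can be called once per definition.
// Hypothetical second function so the model has a menu of functions to pick from.
// Both parameters are declared as plain string parameters to keep the sketch simple.
private static FunctionDef getForecastDefinition() {
    return FunctionDef.builder().name("get_n_day_forecast")
            .description("Get an N-day weather forecast for a given location")
            .setParameters(
                    ObjectParameter.objectParamBuilder()
                            .addParameter(Parameter.builder().name("location")
                                    .description("The city and state, e.g. Austin, TX").build())
                            .addParameter(Parameter.builder().name("num_days")
                                    .description("Number of days to forecast, e.g. 3").build())
                            .required("location")
                            .build()
            ).build();
}
// In runConversation(), offer both definitions; the model picks at most one per response.
final var chatBuilder = ChatRequest.builder()
        .model("gpt-3.5-turbo-0613")
        .addMessage(message)
        .addFunction(getFunctionDefinition())
        .addFunction(getForecastDefinition())
        .functionalCall(ChatRequest.AUTO);
// And register the matching implementation next to get_current_weather:
functionMap.put("get_n_day_forecast", this::getForecast); // getForecast is a hypothetical handler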
2.1 ChatGPT can choose to call a function.
public void runConversation() {
final var chatRequest = ...
final var chatResponse = client.chat(chatRequest);
chatResponse.getResponse().ifPresent(
response -> handleFunctionCallback(chatBuilder, response));
...
}
private void handleFunctionCallback(final ChatRequest.Builder chatBuilder, final ChatResponse chatResponse) {
var responseMessage = chatResponse.getChoices().get(0).getMessage();
var functionCall = responseMessage.getFunctionCall();
var functionResponse = getFunctionResponse(functionCall);
....
The above code sends the chat request to the ChatGPT model and, if a response is present, invokes handleFunctionCallback() to handle the function callback. Inside handleFunctionCallback(), the function call is extracted from the response message and getFunctionResponse() is called to run the matching local function and collect its output.
2.2 ChatGPT selects this function by returning a functionCall from a message
private final Map<String, Function<ObjectNode, String>> functionMap =
new HashMap<>();
/**
*
* Example dummy function hard coded to return the same weather for two cities and a default for other cities.
* In production, this could be your backend API or an external API
* @param objectNode arguments from chat GPT.
* @return JSON string
*/
public String getCurrentWeather(final ObjectNode objectNode) {
final String location = objectNode.getString("location");
final String unit = Optional.ofNullable(objectNode.get("unit"))
.map(node->node.asScalar().stringValue()).orElse("fahrenheit");
final JsonSerializer json = new JsonSerializer();
json.startObject();
json.addAttribute("location", location);
json.addAttribute("unit", unit);
switch (location) {
case "Austin, TX":
json.addAttribute("temperature", 92);
json.endObject();
return json.toString();
case "Boston, MA":
json.addAttribute("temperature", 72);
json.endObject();
return json.toString();
default:
json.addAttribute("temperature", 70);
json.endObject();
return json.toString();
}
}
public WeatherFunctionCallExample(OpenAIClient client) {
this.client = client;
functionMap.put("get_current_weather", this::getCurrentWeather); //register
}
private String getFunctionResponse(FunctionalCall functionCall) {
String functionResponse = "";
if (functionCall !=null && functionMap.containsKey(functionCall.getName())) {
functionResponse = functionMap
.get(functionCall.getName())
.apply(functionCall.getArguments());
}
return functionResponse;
}
The provided code snippet is an example of a weather-related function implementation within a larger ChatGPT conversation. Let's break it down:
Overall, this code demonstrates an example implementation of a weather-related function and how it plugs into the larger program: the functionMap associates function names with their implementations so they can be looked up and invoked whenever ChatGPT requests a function call in the conversation.
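As noted in the introduction, you should confirm with the user before executing any function that has side effects (sending an email, buying something, and so on). Here is a minimal sketch of how such a gate could wrap getFunctionResponse(), assuming both methods live inside WeatherFunctionCallExample; askUser() is a hypothetical placeholder, not part of JAI or the OpenAI API.
// Hypothetical variant of getFunctionResponse() that asks the user before running a function.
private String getFunctionResponseWithConfirmation(final FunctionalCall functionCall) {
    if (functionCall == null || !functionMap.containsKey(functionCall.getName())) {
        return "";
    }
    final String prompt = "The assistant wants to call " + functionCall.getName()
            + " with arguments " + functionCall.getArguments() + ". Proceed?";
    if (!askUser(prompt)) {
        // Tell the model the user declined, so it can answer accordingly.
        return "{\"error\":\"The user declined this action.\"}";
    }
    return functionMap.get(functionCall.getName()).apply(functionCall.getArguments());
}
// Simple console-based confirmation; replace with your application's own UI.
private boolean askUser(final String prompt) {
    System.out.print(prompt + " (y/N) ");
    return new java.util.Scanner(System.in).nextLine().trim().equalsIgnoreCase("y");
}
This matters little for a read-only weather lookup, but it becomes essential once a function can change state on the user's behalf.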
3 Call ChatGPT again by appending the function response as a new message, and let the model summarize the results back to the user.
private void handleFunctionCallback(final ChatRequest.Builder chatBuilder,
final ChatResponse chatResponse) {
var responseMessage = chatResponse.getChoices().get(0).getMessage();
var functionCall = responseMessage.getFunctionCall();
var functionResponse = getFunctionResponse(functionCall);
chatBuilder.addMessage(Message.builder().name(functionCall.getName()).role(Role.FUNCTION)
.content(functionResponse)
.build());
var response = client.chat(chatBuilder.build());
response.getResponse().ifPresent(chatResponse1 ->
System.out.println(chatResponse1.getChoices().get(0).getMessage().getContent()));
}
The handleFunctionCallback() method handles the function-call response received from the ChatGPT model. It retrieves the function call from the response, invokes the appropriate local function to generate a response, adds that function response as a message on the chatBuilder, sends the updated request back to the model, and prints the model's follow-up response to the console.
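The example above handles a single round of function calling. If the model might request more than one function call in a row, you can loop until the returned message no longer carries a functionCall. This is a minimal sketch of such a loop, reusing only the JAI calls shown above and assuming it is added to WeatherFunctionCallExample; the 10-round cap is an arbitrary safety limit.
// Keep answering function calls until the model returns a normal message.
private void handleFunctionCallbacksInLoop(final ChatRequest.Builder chatBuilder, ChatResponse chatResponse) {
    for (int round = 0; round < 10; round++) {
        final var responseMessage = chatResponse.getChoices().get(0).getMessage();
        final var functionCall = responseMessage.getFunctionCall();
        if (functionCall == null) {
            // No function requested: this is the model's final answer for the user.
            System.out.println(responseMessage.getContent());
            return;
        }
        // Run our local implementation and feed its result back as a FUNCTION message.
        final var functionResponse = getFunctionResponse(functionCall);
        chatBuilder.addMessage(Message.builder().name(functionCall.getName()).role(Role.FUNCTION)
                .content(functionResponse)
                .build());
        final var clientResponse = client.chat(chatBuilder.build());
        if (!clientResponse.getResponse().isPresent()) {
            clientResponse.getStatusMessage().ifPresent(System.out::println);
            return; // stop on transport or API errors
        }
        chatResponse = clientResponse.getResponse().get();
    }
}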
JSON TO/FRO
Let's look at the messages going to and from ChatGPT to grok this example fully.
Call the ChatGPT chat with the user query and a set of functions defined in the functions parameter.
Here is the initial message that we send to ChatGPT:
{
"model":"gpt-3.5-turbo-0613",
"messages":[
{
"role":"user",
"content":"What's the weather like in Boston in fahrenheit?"
}
],
"function_call":"auto",
"functions":[
{
"name":"get_current_weather",
"parameters":{
"type":"object",
"properties":{
"location":{
"type":"string"
},
"unit":{
"type":"string",
"enum":[
"celsius",
"fahrenheit"
]
}
},
"required":[
"location"
]
}
}
]
}
Here is the response from ChatGPT:
{
"id": "chatcmpl-7aWQG4KfBi7zLXkQXkGv0lp7YHGuu",
"object": "chat.completion",
"created": 1688938796,
"model": "gpt-3.5-turbo-0613",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": null,
"function_call": {
"name": "get_current_weather",
"arguments": "{\\n \\"location\\": \\"Boston\\",\\n \\"unit\\": \\"fahrenheit\\"\\n}"
}
},
"finish_reason": "function_call"
}
],
"usage": {
"prompt_tokens": 61,
"completion_tokens": 24,
"total_tokens": 85
}
}
Notice that ChatGPT returned a message that has a function_call defined. This means it wants you to invoke your function called get_current_weather.
From the functionCall you call a function in your code with the arguments provided, and get a function response.
Call the ChatGPT again by appending the function response as a new message, and let the model summarize the results back to the user.
Here is the original ask with the function response mixed in.
{
"model":"gpt-3.5-turbo-0613",
"messages":[
{
"role":"user",
"content":"What's the weather like in Boston in fahrenheit?"
},
{
"role":"function",
"content":"{\\"location\\":\\"Boston\\",\\"unit\\":\\"fahrenheit\\",\\"temperature\\":70}",
"name":"get_current_weather"
}
]
...
}
Lastly, ChatGPT mixes the content/context returned from your weather function into its answer. Here is the final response, which incorporates our get_current_weather result:
{
"id": "chatcmpl-7aWQIWMzyaPDNtxJtmBQ43PYfEueh",
"object": "chat.completion",
"created": 1688938798,
"model": "gpt-3.5-turbo-0613",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "The weather in Boston is currently 70 degrees Fahrenheit."
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 81,
"completion_tokens": 12,
"total_tokens": 93
}
}
This developer notebook provides a comprehensive guide on implementing and integrating functions into a ChatGPT conversation with JAI. The first step is defining a function. This can be done by creating a dictionary that maps function names to their respective implementations. The implementation can be a simple function with arguments or a more complex function, such as one that makes API calls to retrieve data.
Next, we covered handling function callbacks. When ChatGPT wants a function called, it returns a "function call" object containing the function's name and its arguments. The function call is extracted from the response message, and the appropriate function is invoked with the provided arguments. The function then generates a response, which is sent back to ChatGPT.
Once the function generates the response, we explain how to call the ChatGPT again by appending the function response as a new message. This allows ChatGPT to mix the results with the content/context returned from the function. The updated chat request is then sent back to the ChatGPT model, and the process continues until a final response is generated.
We also included an example implementation of a weather-related function and its integration into a larger program using a function map. The function map associates function names with their respective implementations for easy access in a ChatGPT conversation.
Overall, this developer notebook explains how to implement and integrate functions into a ChatGPT conversation, from defining a function to handling function callbacks and mixing the results with the content/context returned.