Large Language Model (LLM) Function Calling enables models to interact directly with external functions and APIs, expanding their utility beyond language processing.

Before diving into demystifying LLM function calling, just a few considerations…

The term Large Language Model is increasingly seen as a general reference rather than a precise or technically accurate description.

Today, the term Foundation Model encompasses a broader range of capabilities, including not only language but also vision and multimodal functionalities.

There are also specialised models, such as Small Language Models, which are optimised for lightweight applications, and Large Action Models, which are fine-tuned for structured outputs and agent-based tasks.

This evolution reflects the diversity in AI architectures, with models designed to meet specific needs across various domains and applications. As the landscape grows, terminology will likely continue to evolve.

When using the OpenAI API with function calling, the model itself does not run the functions.

Instead, it generates parameters for potential function calls.

Your application then decides how to handle these parameters, maintaining full control over whether to call the suggested function or take another action.
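For example, when the model decides that a function should be called, the assistant message it returns contains no final answer, only the name of the suggested function and its arguments as a JSON string. An illustrative response, assuming the add_numbers function defined later in this article:

{
  "role": "assistant",
  "content": null,
  "function_call": {
    "name": "add_numbers",
    "arguments": "{\"a\": 12, \"b\": 7}"
  }
}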

In AI language models, the introduction of functions adds a new layer of autonomy.

The function calling capability allows the model to independently determine whether a function call is needed to handle a particular task or if it should respond directly.

By doing so, the model dynamically selects the most suitable response strategy based on the context, enhancing both its adaptability and effectiveness.

This decision-making power introduces a more nuanced autonomy, enabling the model to switch seamlessly between execution and conversation.

In function calling with language models, the model operates autonomously to determine whether a specific function call is appropriate based on the request.

When it identifies a match, it transitions to a more structured approach, preparing data parameters needed for the function. This allows the language model to act as a mediator, enabling efficient function handling while maintaining flexibility in processing the request.
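A minimal sketch of this mediator role, assuming the legacy openai==0.28 interface and the functions schema list from the full example below: after executing the suggested function itself, the application can pass the result back to the model as a message with role "function", so the model can phrase a natural-language reply.

# Sketch only: return a function result to the model so it can formulate a reply.
# Assumes the add_numbers function and the functions list defined further down.
follow_up = openai.ChatCompletion.create(
    model="gpt-4-0613",
    messages=[
        {"role": "user", "content": "What is 12 plus 7?"},
        {"role": "assistant", "content": None,
         "function_call": {"name": "add_numbers", "arguments": '{"a": 12, "b": 7}'}},
        {"role": "function", "name": "add_numbers", "content": '{"result": 19}'}
    ],
    functions=functions
)
print(follow_up["choices"][0]["message"]["content"])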

AI autonomy can be viewed on a spectrum, with varying levels of independence depending on the system’s design.

By integrating function calls within generative AI applications, we introduce not only structure but also an initial layer of autonomy.

This enables AI systems to assess and respond to specific requests with a degree of self-direction. As AI technology evolves, these levels of autonomy are expected to increase, allowing models to handle tasks with greater independence and sophistication.

Consequently, this progression will enhance AI’s capacity to manage complex functions autonomously.

In the Python application below, two functions are defined: one for adding numbers and another for subtracting them.

These functions need not be as confined as in this simple illustrative example; they could just as easily call out to an external API.

You can also see the schema defined for the functions, with a description for each function and for each of its input parameters.

pip install openai==0.28

import openai
import json

# Prompt user to input API key
api_key = input("Please enter your OpenAI API key: ")
openai.api_key = api_key

# Define the tools: an addition function and a subtraction function
def add_numbers(a, b):
    return {"result": a + b}

def subtract_numbers(a, b):
    return {"result": a - b}

# Define the function schema for OpenAI function calling
functions = [
    {
        "name": "add_numbers",
        "description": "Add two numbers together",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {
                    "type": "number",
                    "description": "The first number to add"
                },
                "b": {
                    "type": "number",
                    "description": "The second number to add"
                }
            },
            "required": ["a", "b"]
        }
    },
    {
        "name": "subtract_numbers",
        "description": "Subtract one number from another",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {
                    "type": "number",
                    "description": "The number to subtract from"
                },
                "b": {
                    "type": "number",
                    "description": "The number to subtract"
                }
            },
            "required": ["a", "b"]
        }
    }
]

# Define a function to handle the function calling based on the function name
def handle_function_call(function_name, arguments):
    if function_name == "add_numbers":
        return add_numbers(arguments['a'], arguments['b'])
    elif function_name == "subtract_numbers":
        return subtract_numbers(arguments['a'], arguments['b'])
    else:
        raise ValueError(f"Unknown function: {function_name}")

# Prompting the model with function calling
def call_gpt(prompt):
    response = openai.ChatCompletion.create(
        model="gpt-4-0613",  # gpt-4-0613 supports function calling
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt}
        ],
        functions=functions,
        function_call="auto"  # This allows the model to decide which function to call
    )

    # Check if the model wants to call a function
    message = response["choices"][0]["message"]
    if "function_call" in message:
        function_name = message["function_call"]["name"]
        arguments = json.loads(message["function_call"]["arguments"])
        result = handle_function_call(function_name, arguments)
        print(function_name, arguments, result)
        return f"Function called: {function_name}, Result: {result['result']}"
    else:
        return message["content"]

# Test the app
while True:
    user_input = input("Enter a math problem (addition or subtraction) or 'exit' to quit: ")
    if user_input.lower() == "exit":
        break
    response = call_gpt(user_input)
    print(response)
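
An illustrative run of the loop, assuming the model decides to call add_numbers for the prompt shown (the exact arguments the model produces can vary):

Enter a math problem (addition or subtraction) or 'exit' to quit: What is 12 plus 7?
add_numbers {'a': 12, 'b': 7} {'result': 19}
Function called: add_numbers, Result: 19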

Chief Evangelist @ Kore.ai | I’m passionate about exploring the intersection of AI and language. From Language Models, AI Agents to Agentic Applications, Development Frameworks & Data-Centric Productivity Tools, I share insights and ideas on how these technologies are shaping the future.

https://platform.openai.com/docs/guides/function-calling
