Tool/Function Calling
AISuite makes tool calling incredibly simple. Just pass your functions directly and let AISuite handle the execution automatically.
✨ Key Innovation
Unlike traditional approaches that require manual tool handling, AISuite's max_turns parameter enables automatic tool execution:
- ✓ Pass Python/JS functions directly - no JSON schema needed
- ✓ Automatic execution loop with max_turns
- ✓ Access all intermediate tool calls via intermediate_messages
- ✓ Works with all supported providers
Traditional Approach vs AISuite
Traditional: Manual JSON Schema
Most libraries require you to define tools as JSON schemas:
def will_it_rain(location: str, time_of_day: str):
    """Check if it will rain in a location at a given time today.

    Args:
        location (str): Name of the city
        time_of_day (str): Time of the day in HH:MM format.
    """
    return "YES"

# Traditional approach - manual tool handling
tools = [{
    "type": "function",
    "function": {
        "name": "will_it_rain",
        "description": "Check if it will rain in a location at a given time today",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "Name of the city"},
                "time_of_day": {"type": "string", "description": "Time in HH:MM format"}
            },
            "required": ["location", "time_of_day"]
        }
    }
}]
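On top of the schema, you also have to run the tool-call loop yourself. Here is roughly what that looks like with the OpenAI Python SDK (a sketch that assumes the will_it_rain function and tools schema above; real code also needs dispatch across multiple tools and error handling):

import json
from openai import OpenAI

client = OpenAI()
messages = [{"role": "user", "content": "Will it rain in San Francisco at 14:00?"}]

# First request: the model may answer with a tool call instead of text
response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    messages.append(message)
    for tool_call in message.tool_calls:
        # You are responsible for executing the tool and formatting the result
        args = json.loads(tool_call.function.arguments)
        result = will_it_rain(**args)
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": str(result),
        })
    # Second request: send the tool results back for the final answer
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=messages,
        tools=tools,
    )

print(response.choices[0].message.content)

AISuite removes both the hand-written schema and this loop.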
AISuite: Automatic with Functions
With AISuite, just pass your functions directly:
import aisuite as ai

def will_it_rain(location: str, time_of_day: str):
    """Check if it will rain in a location at a given time today.

    Args:
        location (str): Name of the city
        time_of_day (str): Time of the day in HH:MM format.
    """
    # Your actual implementation here
    return "YES"

client = ai.Client()
messages = [{
    "role": "user",
    "content": "I live in San Francisco. Can you check for weather "
               "and plan an outdoor picnic for me at 2pm?"
}]

# AISuite handles everything automatically!
response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=messages,
    tools=[will_it_rain],  # Just pass the function directly!
    max_turns=2            # Enable automatic execution
)

print(response.choices[0].message.content)
Using Multiple Tools
Pass multiple tools, and AISuite will orchestrate them automatically:
import aisuite as ai

def get_weather(location: str):
    """Get current weather for a location."""
    return {"temp": 72, "condition": "sunny"}

def search_restaurants(location: str, cuisine: str):
    """Search for restaurants by location and cuisine."""
    return [{"name": "Pizza Place", "rating": 4.5}]

def book_reservation(restaurant: str, time: str, party_size: int):
    """Book a restaurant reservation."""
    return {"confirmed": True, "reservation_id": "12345"}

client = ai.Client()

# Pass multiple tools - AISuite will use them as needed
response = client.chat.completions.create(
    model="anthropic:claude-3-5-sonnet-20240620",
    messages=[{
        "role": "user",
        "content": "Find and book an Italian restaurant for 4 people at 7pm in NYC"
    }],
    tools=[get_weather, search_restaurants, book_reservation],
    max_turns=3  # Allow multiple tool calls
)

# Access intermediate tool calls and responses
for msg in response.choices[0].intermediate_messages:
    tool_calls = getattr(msg, "tool_calls", None)
    if tool_calls:
        for call in tool_calls:
            print(f"Called: {call.function.name}")
How It Works
1. Function Analysis: AISuite automatically extracts function signatures and docstrings to create tool specifications (see the sketch after this list)
2. LLM Request: Your message and tool specs are sent to the LLM
3. Tool Execution: When the LLM requests a tool, AISuite automatically executes it
4. Result Return: Tool results are sent back to the LLM
5. Loop: Steps 3-4 repeat up to max_turns times
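To make step 1 concrete, here is a simplified sketch of how a function's signature and docstring can be turned into an OpenAI-style tool spec. The build_tool_spec helper is hypothetical, not AISuite's actual implementation, and only handles basic scalar parameter types:

import inspect
from typing import get_type_hints

def build_tool_spec(func):
    """Derive an OpenAI-style tool spec from a function's signature and docstring."""
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    hints = get_type_hints(func)
    properties = {}
    required = []
    for name, param in inspect.signature(func).parameters.items():
        properties[name] = {"type": type_map.get(hints.get(name, str), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": func.__name__,
            "description": inspect.getdoc(func) or "",
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

# With the will_it_rain function from the first example,
# build_tool_spec(will_it_rain) yields a schema equivalent to the
# hand-written one in the "Traditional" snippet above.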
Continuing Conversations
Access the full conversation history including tool interactions:
# Continue conversation with tool context
messages.extend(response.choices[0].intermediate_messages)
messages.append({
    "role": "user",
    "content": "Actually, change it to 8pm instead"
})

# The model has context of previous tool calls
response2 = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=messages,
    tools=[book_reservation],
    max_turns=1
)
Best Practices
📝 Write Clear Docstrings
AISuite uses your function docstrings to help the LLM understand when and how to use tools. Include clear descriptions and parameter documentation.
🔄 Set Appropriate max_turns
Use 1-2 turns for simple tools, 3-5 for complex workflows. Higher values allow more tool interactions but increase latency and cost.
⚡ Handle Errors Gracefully
Return informative error messages from your tools. The LLM can often recover and try alternative approaches.
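For example, rather than letting an exception propagate out of the tool, catch it and return a message the model can act on. A toy sketch (the hard-coded lookup stands in for a real weather source):

def get_weather(location: str):
    """Get current weather for a location."""
    known = {"San Francisco": {"temp": 62, "condition": "foggy"}}
    try:
        return known[location]
    except KeyError:
        # Returning an informative string lets the LLM recover, e.g. by
        # asking the user for a nearby city it does know about.
        return f"No weather data available for {location!r}; try a larger nearby city."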
🎯 Keep Tools Focused
Design tools to do one thing well. Multiple focused tools are better than one complex tool.