# Function Calling
Function calling (tool use) lets AI models invoke external functions. AI-Lib provides a unified interface for tool calling across all providers that support it.
## Defining Tools
### Rust

```rust
use ai_lib_rust::ToolDefinition;
use serde_json::json;

let get_weather = ToolDefinition {
    name: "get_weather".into(),
    description: Some("Get current weather for a city".into()),
    parameters: json!({
        "type": "object",
        "properties": {
            "city": { "type": "string", "description": "City name" },
            "unit": { "type": "string", "enum": ["celsius", "fahrenheit"] }
        },
        "required": ["city"]
    }),
};
```

### Python
```python
get_weather = {
    "name": "get_weather",
    "description": "Get current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}
```

### TypeScript
```typescript
import { ToolDefinition } from '@hiddenpath/ai-lib-ts';

const getWeather: ToolDefinition = {
  name: 'get_weather',
  description: 'Get current weather for a city',
  parameters: {
    type: 'object',
    properties: {
      city: { type: 'string', description: 'City name' },
      unit: { type: 'string', enum: ['celsius', 'fahrenheit'] },
    },
    required: ['city'],
  },
};
```

## Non-Streaming Tool Calls
### Rust

```rust
let response = client.chat()
    .user("What's the weather in Tokyo?")
    .tools(vec![get_weather])
    .execute()
    .await?;

for call in &response.tool_calls {
    println!("Function: {}", call.name);
    println!("Arguments: {}", call.arguments);
    // Execute the function and send results back
}
```

### Python
```python
response = await client.chat() \
    .user("What's the weather in Tokyo?") \
    .tools([get_weather]) \
    .execute()

for call in response.tool_calls:
    print(f"Function: {call.name}")
    print(f"Arguments: {call.arguments}")
```

### TypeScript
```typescript
const response = await client
  .chat()
  .user("What's the weather in Tokyo?")
  .tools([getWeather])
  .execute();

for (const call of response.toolCalls) {
  console.log(`Function: ${call.name}`);
  console.log(`Arguments: ${call.arguments}`);
}
```

## Streaming Tool Calls
Tool calls stream as partial events that the pipeline’s Accumulator assembles:

### Rust
```rust
let mut stream = client.chat()
    .user("What's the weather?")
    .tools(vec![get_weather])
    .stream()
    .execute_stream()
    .await?;

while let Some(event) = stream.next().await {
    match event? {
        StreamingEvent::ToolCallStarted { name, id, .. } => {
            println!("Starting tool: {name} (id: {id})");
        }
        StreamingEvent::PartialToolCall { arguments, .. } => {
            print!("{arguments}"); // Partial JSON arguments
        }
        StreamingEvent::ToolCallEnded { id, .. } => {
            println!("\nTool call {id} complete");
        }
        StreamingEvent::ContentDelta { text, .. } => {
            print!("{text}");
        }
        _ => {}
    }
}
```

### Python
```python
async for event in client.chat() \
        .user("What's the weather?") \
        .tools([get_weather]) \
        .stream():
    if event.is_tool_call_started:
        call = event.as_tool_call_started
        print(f"Starting: {call.name}")
    elif event.is_partial_tool_call:
        print(event.as_partial_tool_call.arguments, end="")
    elif event.is_content_delta:
        print(event.as_content_delta.text, end="")
```

### TypeScript
```typescript
for await (const event of client
  .chat()
  .user("What's the weather?")
  .tools([getWeather])
  .stream()) {
  if (event.isToolCallStarted) {
    const call = event.asToolCallStarted;
    console.log(`Starting: ${call.name}`);
  } else if (event.isPartialToolCall) {
    process.stdout.write(event.asPartialToolCall.arguments);
  } else if (event.isContentDelta) {
    process.stdout.write(event.asContentDelta.text);
  }
}
```

## How It Works
Section titled “How It Works”- You define tools and pass them in the request
- The protocol manifest maps
toolsto the provider-specific format - The model decides to call a tool (or respond with text)
- For streaming, the pipeline’s Accumulator assembles partial tool call chunks
- You receive unified
ToolCallStarted,PartialToolCall, andToolCallEndedevents
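The streaming half of this flow (assembling partial chunks, then handing the completed call to local code) can be sketched in a few lines. The `ToolCallAccumulator` class and `handlers` map below are illustrative stand-ins, not AI-Lib types: fragments are concatenated as they arrive, parsed as JSON once the call ends, and dispatched to a local function.

```typescript
// Illustrative only: AI-Lib's pipeline performs this assembly for you.
class ToolCallAccumulator {
  readonly name: string;
  private buffer = "";

  constructor(name: string) {
    this.name = name;
  }

  // Called once per PartialToolCall event.
  push(fragment: string): void {
    this.buffer += fragment;
  }

  // Called on ToolCallEnded; the buffer now holds complete JSON.
  finish(): Record<string, unknown> {
    return JSON.parse(this.buffer);
  }
}

// Hypothetical local handlers, keyed by tool name.
const handlers: Record<string, (args: Record<string, unknown>) => string> = {
  get_weather: (args) => `22°C in ${args.city}`,
};

const acc = new ToolCallAccumulator("get_weather");
// Argument fragments as they might arrive over the stream:
for (const chunk of ['{"city": "To', 'kyo", "unit": "cel', 'sius"}']) {
  acc.push(chunk);
}
const args = acc.finish();
const result = handlers[acc.name](args);
console.log(result); // 22°C in Tokyo
```

The real Accumulator also tracks tool call IDs so that interleaved calls from a single response are assembled separately.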
## Provider Support
Check the provider’s capabilities before using tools:
| Provider | Tool Calling |
|---|---|
| OpenAI | Supported |
| Anthropic | Supported |
| Gemini | Supported |
| DeepSeek | Supported |
| Groq | Supported |
| Mistral | Supported |
| Qwen | Supported |
The manifest’s `capabilities.tools: true` flag indicates support.
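As a sketch of that check, assuming a simplified manifest shape (the `ProviderManifest` interface below is a hypothetical stand-in, not AI-Lib’s actual manifest schema), you might gate tool use on the flag like this:

```typescript
// Hypothetical, simplified manifest shape; real protocol manifests
// carry more fields. Only capabilities.tools matters here.
interface ProviderManifest {
  provider: string;
  capabilities: { tools?: boolean };
}

function supportsTools(manifest: ProviderManifest): boolean {
  // Treat a missing flag as "no tool support".
  return manifest.capabilities.tools === true;
}

const openai: ProviderManifest = {
  provider: "openai",
  capabilities: { tools: true },
};
const legacy: ProviderManifest = {
  provider: "legacy",
  capabilities: {},
};

console.log(supportsTools(openai)); // true
console.log(supportsTools(legacy)); // false
```

Failing fast on an unsupported provider gives a clearer error than letting the request reach the provider and fail remotely.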