## Installation
Install the Simforge Python SDK using pip:
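```bash
pip install simforge-py
```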
Or with Poetry:
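```bash
poetry add simforge-py
```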
### Optional Dependencies
For OpenAI Agents SDK tracing support:
```bash
pip install simforge-py[openai-tracing]
```
## Quick Start
```python
import os

from simforge import Simforge

# Initialize the client
client = Simforge(
    api_key=os.environ["SIMFORGE_API_KEY"],
    env_vars={
        "OPENAI_API_KEY": os.environ["OPENAI_API_KEY"],
    },
)

# Call a function
result = client.call("ExtractName", text="My name is John Doe")
print(result)
# ExtractNameOutput(firstName="John", lastName="Doe")

# Access as dict
print(result.model_dump())
# {"firstName": "John", "lastName": "Doe"}
```
Currently, only OpenAI is supported as an LLM provider. Support for additional providers is coming soon.
## Configuration

### Constructor Options
| Option | Type | Required | Description |
|---|---|---|---|
| `api_key` | `str` | Yes | Your Simforge API key |
| `env_vars` | `dict[str, str]` | No | Environment variables to pass to BAML execution |
### Environment Variables
Pass your OpenAI API key to enable BAML function execution:
```python
client = Simforge(
    api_key=os.environ["SIMFORGE_API_KEY"],
    env_vars={
        "OPENAI_API_KEY": os.environ["OPENAI_API_KEY"],
    },
)
```
Currently, only `OPENAI_API_KEY` is supported. Additional LLM providers will be added in future releases.
## Calling Functions

### Basic Call
```python
result = client.call("FunctionName", input_field="value")
```
### With Keyword Arguments
```python
result = client.call(
    "ExtractPerson",
    text="John Doe is a software engineer from San Francisco",
    include_location=True,
)
```
### Accessing Results
Results are returned as Pydantic models:
```python
result = client.call("ExtractPerson", text="...")

# Access fields directly
print(result.firstName)
print(result.lastName)

# Convert to dict
print(result.model_dump())

# Convert to JSON
print(result.model_dump_json())
```
## Tracing with OpenAI Agents SDK
The Simforge SDK integrates with the OpenAI Agents SDK to automatically capture traces:
```python
import os

from agents import Agent, Runner, set_trace_processors, trace

from simforge import Simforge

# Create Simforge client
client = Simforge(
    api_key=os.environ["SIMFORGE_API_KEY"],
)

# Get the tracing processor
processor = client.get_openai_tracing_processor()

# Register with OpenAI Agents SDK
set_trace_processors([processor])

# Create an agent
agent = Agent(
    name="My Agent",
    instructions="You are a helpful assistant",
    model="gpt-4o",
)

# Run with tracing
with trace("My Workflow", metadata={"environment": "production"}):
    result = Runner.run_sync(agent, "Hello!")
```
## Error Handling
```python
from simforge import Simforge, SimforgeError

try:
    # `inputs` is a dict of keyword arguments for the function
    result = client.call("FunctionName", **inputs)
except SimforgeError as e:
    # SimforgeError exposes the error message and status of the failed call
    print(f"Simforge error: {e.message}")
    print(f"Status: {e.status}")
except Exception:
    # Re-raise anything that is not a Simforge error
    raise
```
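Transient failures (for example rate limits or temporary server errors) are often worth retrying. The helper below is not part of the SDK; it is a minimal sketch that assumes `e.status` carries an HTTP-style status code, as shown above, and wraps `client.call` in a simple backoff loop:

```python
import time

from simforge import SimforgeError

# Statuses treated as transient in this sketch (assumption, adjust as needed)
RETRYABLE_STATUSES = {429, 500, 502, 503, 504}

def call_with_retry(client, function_name, max_attempts=3, **inputs):
    """Call a Simforge function, retrying transient failures with backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return client.call(function_name, **inputs)
        except SimforgeError as e:
            # Give up on non-retryable statuses or once attempts run out
            if e.status not in RETRYABLE_STATUSES or attempt == max_attempts:
                raise
            time.sleep(2 ** attempt)  # back off: 2s, 4s, ...
```

Usage follows the same pattern as `client.call`, for example `call_with_retry(client, "ExtractName", text="My name is John Doe")`.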
## Examples

### Extract Structured Data
```python
result = client.call(
    "ExtractInvoice",
    text="""
    Invoice #12345
    Date: 2024-01-15
    Total: $150.00
    Items:
    - Widget x2 @ $50.00
    - Gadget x1 @ $50.00
    """,
)

print(result.model_dump())
# {
#     "invoiceNumber": "12345",
#     "date": "2024-01-15",
#     "total": 150.00,
#     "items": [
#         {"name": "Widget", "quantity": 2, "price": 50.00},
#         {"name": "Gadget", "quantity": 1, "price": 50.00}
#     ]
# }
```
### Classify Text
```python
result = client.call(
    "ClassifySentiment",
    text="I love this product! It's amazing!",
)

print(result.sentiment)   # "positive"
print(result.confidence)  # 0.95
```
## Async Support
```python
import asyncio
import os

from simforge import Simforge

async def main():
    client = Simforge(api_key=os.environ["SIMFORGE_API_KEY"])

    # Async call
    result = await client.call_async("ExtractName", text="John Doe")
    print(result)

asyncio.run(main())
```
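Because `call_async` returns an awaitable, independent calls can run concurrently. A minimal sketch using `asyncio.gather` (the function names and inputs below reuse earlier examples and are illustrative only):

```python
import asyncio
import os

from simforge import Simforge

async def main():
    client = Simforge(
        api_key=os.environ["SIMFORGE_API_KEY"],
        env_vars={"OPENAI_API_KEY": os.environ["OPENAI_API_KEY"]},
    )

    # Run two independent calls concurrently
    name_result, sentiment_result = await asyncio.gather(
        client.call_async("ExtractName", text="My name is John Doe"),
        client.call_async("ClassifySentiment", text="I love this product!"),
    )
    print(name_result)
    print(sentiment_result)

asyncio.run(main())
```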