What is Simforge?

Simforge is a platform that helps you build, test, and deploy LLM-powered functions. It provides:
  • Function Management: Define and version your LLM functions with BAML
  • Tracing: Monitor and debug your LLM calls with detailed traces
  • Testing: Run test suites against your functions to ensure quality
  • SDKs: Call your functions from TypeScript or Python applications

Quick Start

How It Works

  1. Define Functions: Create LLM functions in the Simforge web portal using BAML syntax
  2. Get API Keys: Generate API keys from the settings page
  3. Install SDK: Add the Simforge SDK to your project
  4. Call Functions: Use the SDK to call your functions with type-safe inputs and outputs
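For a TypeScript project, step 3 is typically a single package install. The package name below is taken from the import in the example that follows; the exact command depends on your package manager:

```shell
# Install the Simforge SDK (npm shown; use your package manager of choice)
npm install @goharvest/simforge
```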
For example, calling a function from TypeScript:

import { Simforge } from "@goharvest/simforge"

const client = new Simforge({
  apiKey: process.env.SIMFORGE_API_KEY,
  envVars: {
    OPENAI_API_KEY: process.env.OPENAI_API_KEY,
  },
})

const result = await client.call("ExtractName", {
  text: "My name is John Doe",
})

console.log(result) // { firstName: "John", lastName: "Doe" }

Note: Currently, only OpenAI is supported as an LLM provider.

Architecture

Simforge uses BAML (Basically A Made-up Language) to define LLM functions. BAML provides:
  • Type-safe inputs and outputs
  • Prompt templating
  • Provider configuration
  • Automatic retry and fallback logic
Your BAML functions are executed locally by the SDK, giving you full control over your LLM calls while Simforge handles versioning, testing, and monitoring.
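As a minimal sketch of how those pieces fit together, a BAML definition combines a typed output class, a provider client, and a prompt template. The class, client, and model names here are illustrative, not taken from Simforge; consult the BAML documentation for the authoritative syntax:

```baml
// Output type: gives the function a type-safe, structured result
class Name {
  firstName string
  lastName string
}

// Provider configuration (OpenAI, per the note above)
client<llm> GPT4o {
  provider openai
  options {
    model "gpt-4o"
    api_key env.OPENAI_API_KEY
  }
}

// Prompt template with a typed input and output
function ExtractName(text: string) -> Name {
  client GPT4o
  prompt #"
    Extract the person's name from the following text:
    {{ text }}

    {{ ctx.output_format }}
  "#
}
```

A function defined this way is what the SDK's `client.call("ExtractName", { text })` invokes in the Quick Start example above.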