Seamlessly integrate LLMs into application code

Turn LLM prompts into typed functions and call them like any other application code. Nuabase enforces the schema, retries failures, caches results row by row, and tracks your per-customer token cost. It streams structured JSON results directly to your front-end over Server-Sent Events and delivers them to your back-end through webhooks.

TypeScript
import { z } from "zod";
import { Nuabase } from "@nuabase/sdk";

const nua = new Nuabase({ apiKey: process.env.NUABASE_API_KEY });

const summarizeThread = nua.fn({
  name: "summarizeSupportThread",
  output: z.object({
    summary: z.string(),
    nextAction: z.string(),
    priority: z.enum(["low", "normal", "urgent"]),
  }),
  prompt: "Summarize the conversation and suggest the next action. Thread: ",
});

const result = await summarizeThread({ body: ticket.comments });

// result: { summary: string; nextAction: string; priority: "low" | "normal" | "urgent" }

Drop it into any service, stream progress to the UI, and rely on the contract on every call.

Ship real features in minutes

Support triage

Summaries, tags, and priority suggestions arrive back as validated JSON. No parsing branches or prompt gymnastics.

Lead enrichment

Convert free-form notes into firmographics and recommended follow-ups that drop straight into your CRM.

Content moderation

Flag edge cases with consistent verdicts and reasoning fields that your automation can consume immediately.
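
As a sketch, a moderation function can follow the same nua.fn pattern as the support example above; the verdict values, field names, and prompt here are placeholders rather than a fixed Nuabase contract.

// Illustrative only: reuses the client and pattern from the example above.
const moderatePost = nua.fn({
  name: "moderateUserPost",
  output: z.object({
    verdict: z.enum(["allow", "review", "block"]),
    categories: z.array(z.string()),
    reasoning: z.string(),
  }),
  prompt: "Review this post against the community guidelines. Post: ",
});

const decision = await moderatePost({ body: post.text });
// decision.verdict is always one of "allow" | "review" | "block"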

Document intelligence

Pipe outlines, highlights, and extracted fields straight to your UI with progress streaming along the way.

Nuabase at a glance

Create a typed function once, call it from anywhere. Nuabase SDKs expose a regular async function whose return value already matches your schema. No parsing, no JSON juggling, just domain objects.

1. Pick the inputs.

Decide the rows, tickets, or documents you want enriched and pass them in as plain data.

const leads = [
  { id: "lead-101", notes: "Growth-stage SaaS, ~80 employees, wants a demo." },
];
2. Shape the result.

Define the fields and enums you expect so every run returns the same structure.

const LeadInsights = z.object({
  industry: z.string(),
  company_size: z.enum(["SMB", "Mid-market", "Enterprise"]),
});
3. Describe the job.

Write the instructions as an LLM prompt — Nuabase turns it into a callable function that lives alongside your code.

const classifyLeads = nua.map({
  name: "insights",
  output: LeadInsights,
  prompt: "Classify each lead with industry and company_size bucket.",
});
4. Invoke and iterate.

Call it like any async function, watch streaming progress, and ship the feature without custom orchestration.

const result = await classifyLeads.now(leads, "id");
if (result.isError) throw new Error(result.error);
console.log(result.data[0].insights);
// -> { industry: "Software", company_size: "Mid-market" }

The plumbing you don't want to rebuild

Schema enforcement.

Each response is validated before you see it—failed rows are retried automatically.
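
Conceptually, enforcement means every model response must pass the schema you declared before it reaches your code. The snippet below is not Nuabase internals, just plain Zod applied to the LeadInsights schema from step 2 to show what a validation failure looks like.

// Plain Zod, shown only to illustrate the contract Nuabase enforces for you.
const parsed = LeadInsights.safeParse({ industry: "Software", company_size: "Huge" });
if (!parsed.success) {
  // "Huge" is not in the enum, so this payload fails; Nuabase would retry the row
  // instead of handing you the invalid object.
  console.error(parsed.error.issues);
}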

Async jobs.

Long tasks move to managed queues with webhooks, backoff, and replay support.
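
On the receiving side, a webhook handler can be an ordinary HTTP route. This is a sketch using Express; the route path and payload fields are assumptions for illustration, not a documented Nuabase payload shape.

import express from "express";

const app = express();
app.use(express.json());

app.post("/webhooks/nuabase", (req, res) => {
  const { jobId, status, data } = req.body; // hypothetical field names
  if (status === "completed") {
    // Persist `data`, notify the customer, kick off the next step, etc.
    console.log("job finished", jobId);
  }
  res.sendStatus(200); // acknowledge so the delivery is not retried or replayed
});

app.listen(3000);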

Streaming.

Server-sent events keep the UI updated while the model runs.
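
On the front end, Server-Sent Events arrive through the standard EventSource API. The endpoint path and payload below are illustrative assumptions; the point is that the browser code is a few lines.

const events = new EventSource("/api/nuabase/jobs/job-123/stream"); // illustrative path
events.onmessage = (event) => {
  const update = JSON.parse(event.data); // e.g. one validated row at a time
  console.log(update); // swap in your own UI update here
};
events.onerror = () => events.close();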

Caching and attribution.

Cache hits, cost per customer, and usage metrics ship with every response.
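
The exact metadata shape is not shown on this page, so the field names below are placeholders; the point is that cache and cost data ride along on the response you already have instead of a separate billing export.

const run = await classifyLeads.now(leads, "id");
// Hypothetical fields, for illustration only.
console.log(run.meta?.cachedRows, run.meta?.tokens, run.meta?.costPerCustomer);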

Why use Nuabase

  • Type-safe from prompt to runtime. Schemas compile to strict validation, so downstream code never guesses at field names (see the sketch after this list).
  • Zero glue code. Queues, retries, SSE, webhooks, and polling are baked in, with no custom workers to maintain.
  • Production visibility. Every request lands in the console with logs, replay, and tracing IDs for quick debugging.
  • Dependable ergonomics. LLM calls behave like regular application code, so your team ships faster without orchestration churn.
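
As a sketch of what that type safety buys you, the priority enum from the first example narrows in a switch, so the compiler flags any value you forget to handle. The helper functions here are hypothetical placeholders.

// `result` comes from the summarizeThread example above; its priority field
// is typed as "low" | "normal" | "urgent", so every branch is checked at compile time.
switch (result.priority) {
  case "urgent":
    await pageOnCall(ticket); // hypothetical helper
    break;
  case "normal":
  case "low":
    await scheduleFollowUp(ticket); // hypothetical helper
    break;
}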

Start with TypeScript, or call the HTTP API directly

We currently have a TypeScript SDK that provides well-typed outputs using Zod, as well as an HTTP API that you can invoke with JSON Schema. If you need support for other languages, please let us know.
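
For the HTTP API, a request might look roughly like the sketch below. The endpoint URL and request fields are illustrative assumptions; the idea is that you send a prompt plus a JSON Schema instead of a Zod object and get validated JSON back.

// Illustrative only: endpoint and body fields are assumptions, not documented names.
const response = await fetch("https://api.nuabase.example/v1/functions/run", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.NUABASE_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    prompt: "Classify this lead with industry and company_size bucket.",
    output_schema: {
      type: "object",
      properties: {
        industry: { type: "string" },
        company_size: { type: "string", enum: ["SMB", "Mid-market", "Enterprise"] },
      },
      required: ["industry", "company_size"],
    },
    input: { notes: "Growth-stage SaaS, ~80 employees, wants a demo." },
  }),
});
console.log(await response.json());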