Quickstart
Get a working example running in minutes.
Option 1: Create an Agent (Recommended)
The trik create-agent command scaffolds a complete agent project with LangGraph integration and gateway setup pre-configured.
1. Install the CLI
npm install -g @trikhub/cli

2. Scaffold an Agent
trik create-agent ts
cd my-agent
npm install

3. Install a Trik
# Optional Trik to manage triks directly through CLI
trik install @molefas/trikster
# Demo trik
trik install @molefas/trik-demo-notes

4. Run
npm run dev

That’s it — you have a working agent with trik integration. The scaffolded project includes a conversation loop, so you can start chatting right away.
Option 2: Add to Existing Project
1. Install Dependencies
npm install -g @trikhub/cli
npm install @trikhub/gateway

The @trikhub/gateway package includes everything you need to load and run triks. You do not need to install @trikhub/manifest separately — it is included as a transitive dependency.
2. Install a Trik
# Optional Trik to manage triks directly through CLI
trik install @molefas/trikster
# Demo trik
trik install @molefas/trik-demo-notes

This checks the TrikHub registry, adds the trik to your package.json via a git URL, and registers it in .trikhub/config.json.
3. Integrate with Your Agent
The fastest way to integrate is using the LangChain adapter’s enhance() function, which wraps your LangGraph agent with handoff routing:
import { createReactAgent } from '@langchain/langgraph/prebuilt';
import { ChatOpenAI } from '@langchain/openai';
import { enhance } from '@trikhub/gateway/langchain';
const model = new ChatOpenAI({ model: 'gpt-4o-mini' });
// enhance() creates the gateway, loads triks, and manages the agent lifecycle
const app = await enhance(null, {
createAgent: (trikTools) =>
createReactAgent({
llm: model,
tools: [...yourOwnTools, ...trikTools],
}),
});
const response = await app.processMessage('Find me some AI articles');
console.log(response.message); // The trik's response
console.log(response.source); // "main", trik ID, or "system"

If you need more control over gateway setup:
import { createReactAgent } from '@langchain/langgraph/prebuilt';
import { ChatOpenAI } from '@langchain/openai';
import { enhance } from '@trikhub/gateway/langchain';
import { TrikGateway } from '@trikhub/gateway';
const gateway = new TrikGateway();
await gateway.initialize();
await gateway.loadTriksFromConfig();
const model = new ChatOpenAI({ model: 'gpt-4o-mini' });
const app = await enhance(null, {
gatewayInstance: gateway,
createAgent: (trikTools) =>
createReactAgent({
llm: model,
tools: [...yourOwnTools, ...trikTools],
}),
});
const response = await app.processMessage('Find me some AI articles');

4. Build a Conversation Loop
The processMessage method handles the full routing lifecycle. Here is a minimal conversation loop:
import * as readline from 'node:readline';
const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
function ask(prompt: string): Promise<string> {
return new Promise((resolve) => rl.question(prompt, resolve));
}
while (true) {
const userMessage = await ask('You: ');
if (userMessage === 'exit') break;
const response = await app.processMessage(userMessage);
console.log(`[${response.source}]: ${response.message}`);
}
rl.close(); // release stdin so the process can exit cleanly

What Happens
- User sends a message
- The gateway checks if there is an active handoff — if so, the message goes directly to the trik agent
- If no handoff is active, the message goes to your main agent
- Your main agent sees talk_to_<trik> tools and exposed tools in its tool set
- If the agent calls a handoff tool, the gateway starts a handoff session with the trik
- The trik handles the conversation autonomously until it transfers back
- On transfer-back, a structured session summary is injected into the main agent’s history
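The routing decision described above can be sketched as a tiny state machine. This is an illustrative simplification, not the gateway's actual implementation — `GatewayState`, `routeMessage`, `startHandoff`, and `transferBack` are hypothetical names for this sketch:

```typescript
// Simplified model of handoff routing: one optional active handoff session.
type Route = 'main' | 'trik';

interface GatewayState {
  activeHandoff: string | null; // trik ID of the active handoff session, if any
}

// Step 2–3: an active handoff routes the message straight to the trik agent;
// otherwise it goes to the main agent.
function routeMessage(state: GatewayState): Route {
  return state.activeHandoff ? 'trik' : 'main';
}

// Step 5: the main agent calling a talk_to_<trik> tool opens a handoff session.
function startHandoff(state: GatewayState, trikId: string): GatewayState {
  return { ...state, activeHandoff: trikId };
}

// Step 7: on transfer-back, the session ends and control returns to the main agent.
function transferBack(state: GatewayState): GatewayState {
  return { ...state, activeHandoff: null };
}

let state: GatewayState = { activeHandoff: null };
console.log(routeMessage(state)); // "main" — no handoff active yet
state = startHandoff(state, '@molefas/trik-demo-notes');
console.log(routeMessage(state)); // "trik" — messages bypass the main agent
state = transferBack(state);
console.log(routeMessage(state)); // "main" — control is back with the main agent
```

In the real gateway this logic runs inside processMessage, which also injects the session summary into the main agent's history on transfer-back.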
Next Steps
- Learn about the CLI commands
- Understand the LangChain integration in depth
- Explore the Gateway API for custom integrations
- Learn how to Create Triks