LLM Integration
How to integrate molroo-api with OpenAI, Anthropic, and other LLM providers.
molroo-api computes emotional state -- your LLM generates the actual response. The API now returns prompt_data.formatted, a ready-to-use text block you can drop directly into your LLM's system prompt. No more manually constructing emotion descriptions.
Architecture Overview
Each turn flows in one direction: the user's message goes to molroo-api, which updates the session's emotional state and returns prompt_data.formatted; you pass that block to your LLM as the system prompt, and the LLM generates the in-character reply.
Step 1: Create a Session
Start by creating a session with a preset or custom persona. Presets give you a ready-made character to start with:
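A minimal JavaScript sketch of this step. The base URL, the POST /v1/sessions path, and the preset/sessionId field names are illustrative assumptions, not confirmed by this page; adjust them to the actual molroo-api reference:

```javascript
// NOTE: base URL, endpoint path, and field names below are assumptions.
const BASE_URL = "https://api.molroo.example";

// Pure helper: build the session-creation request body from a preset name.
function buildSessionRequest(preset) {
  return { preset };
}

// Create a session and return its id.
async function createSession(preset) {
  const res = await fetch(`${BASE_URL}/v1/sessions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildSessionRequest(preset)),
  });
  if (!res.ok) throw new Error(`Session creation failed: ${res.status}`);
  const data = await res.json();
  return data.sessionId; // field name assumed
}
```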
Step 2: Process a Turn
Send the user's message to molroo-api. The simplified input only requires sessionId and message:
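A sketch of the turn call in JavaScript. Only sessionId and message are documented inputs; the base URL and the /v1/turn path are assumptions for illustration:

```javascript
// NOTE: base URL and endpoint path are assumptions; sessionId and message
// are the documented inputs.
const BASE_URL = "https://api.molroo.example";

// Pure helper: the simplified turn input.
function buildTurnRequest(sessionId, message) {
  return { sessionId, message };
}

// Process one turn and return the full response, including prompt_data.
async function processTurn(sessionId, message) {
  const res = await fetch(`${BASE_URL}/v1/turn`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildTurnRequest(sessionId, message)),
  });
  if (!res.ok) throw new Error(`Turn failed: ${res.status}`);
  return res.json(); // includes prompt_data.formatted
}
```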
The response includes prompt_data.formatted -- a complete, pre-built system prompt block:
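An illustrative response shape. Only prompt_data.formatted is documented on this page; the example text inside it is invented to show the idea:

```json
{
  "prompt_data": {
    "formatted": "You are currently feeling a moderate sense of joy. Let this color your tone and word choice, but never name the emotion or its intensity directly."
  }
}
```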
Step 3: Call the LLM
Use prompt_data.formatted directly as your system prompt. No manual prompt construction needed.
OpenAI
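A sketch using OpenAI's Chat Completions endpoint via fetch. The model name is just an example; the molroo-api side is assumed to have already returned prompt_data.formatted:

```javascript
// Pure helper: place prompt_data.formatted as the system message.
function buildOpenAIMessages(formattedPrompt, userMessage) {
  return [
    { role: "system", content: formattedPrompt },
    { role: "user", content: userMessage },
  ];
}

async function callOpenAI(formattedPrompt, userMessage) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // example model name
      messages: buildOpenAIMessages(formattedPrompt, userMessage),
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```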
Anthropic
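With Anthropic's Messages API, prompt_data.formatted maps naturally onto the top-level system parameter rather than a message role. The model name is just an example:

```javascript
// Pure helper: build the Messages API request body.
function buildAnthropicRequest(formattedPrompt, userMessage) {
  return {
    model: "claude-3-5-sonnet-20241022", // example model name
    max_tokens: 1024,
    system: formattedPrompt, // top-level system param, not a message role
    messages: [{ role: "user", content: userMessage }],
  };
}

async function callAnthropic(formattedPrompt, userMessage) {
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": process.env.ANTHROPIC_API_KEY,
      "anthropic-version": "2023-06-01",
    },
    body: JSON.stringify(buildAnthropicRequest(formattedPrompt, userMessage)),
  });
  const data = await res.json();
  return data.content[0].text;
}
```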
Customizing the System Prompt
If you need to add your own instructions alongside the emotional context, prepend or append to prompt_data.formatted:
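For example, a small helper that prepends your own character instructions to the emotional context block (the persona text here is invented for illustration):

```javascript
// Prepend your own instructions to the emotional context block.
function buildSystemPrompt(formatted) {
  const instructions = "You are Mira, a ship's navigator. Stay in character."; // example persona
  return `${instructions}\n\n${formatted}`;
}
```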
Complete Example
A full integration putting all steps together:
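A condensed sketch of a full turn with OpenAI: process the message with molroo-api, then use prompt_data.formatted as the system prompt. The molroo base URL and endpoint path are assumptions; the model name is an example:

```javascript
const MOLROO = "https://api.molroo.example"; // base URL is an assumption

// Small JSON POST helper.
async function post(url, body, headers = {}) {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json", ...headers },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${url} -> ${res.status}`);
  return res.json();
}

// Pull the ready-made system prompt out of a turn response.
function extractSystemPrompt(turn) {
  return turn?.prompt_data?.formatted ?? "";
}

async function chat(sessionId, userMessage) {
  // 1. Let molroo-api update the emotional state for this turn.
  const turn = await post(`${MOLROO}/v1/turn`, { sessionId, message: userMessage });

  // 2. Use prompt_data.formatted directly as the system prompt.
  const completion = await post(
    "https://api.openai.com/v1/chat/completions",
    {
      model: "gpt-4o-mini", // example model name
      messages: [
        { role: "system", content: extractSystemPrompt(turn) },
        { role: "user", content: userMessage },
      ],
    },
    { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` }
  );
  return completion.choices[0].message.content;
}
```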
With Anthropic
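The same flow with Anthropic, passing prompt_data.formatted as the top-level system parameter. The molroo base URL and endpoint path are assumptions; the model name is an example:

```javascript
// Pure helper: build the Anthropic request from the formatted prompt.
function buildAnthropicBody(systemPrompt, userMessage) {
  return {
    model: "claude-3-5-sonnet-20241022", // example model name
    max_tokens: 1024,
    system: systemPrompt,
    messages: [{ role: "user", content: userMessage }],
  };
}

async function chatWithAnthropic(sessionId, userMessage) {
  // 1. Process the turn with molroo-api (base URL/path assumed).
  const turnRes = await fetch("https://api.molroo.example/v1/turn", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sessionId, message: userMessage }),
  });
  const turn = await turnRes.json();

  // 2. Generate the reply with Claude.
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": process.env.ANTHROPIC_API_KEY,
      "anthropic-version": "2023-06-01",
    },
    body: JSON.stringify(buildAnthropicBody(turn.prompt_data.formatted, userMessage)),
  });
  const data = await res.json();
  return data.content[0].text;
}
```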
Using Raw prompt_data Fields
If you prefer to build your own system prompt, the individual fields are also available:
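For example, a custom prompt built from individual fields. This page does not enumerate the prompt_data schema, so the primary_emotion and intensity field names below are hypothetical placeholders; check the API reference for the real ones:

```javascript
// Hypothetical field names for illustration -- only `formatted` is
// documented on this page; consult the prompt_data schema for the rest.
function buildCustomPrompt(promptData) {
  const emotion = promptData.primary_emotion ?? "neutral";
  const intensity = promptData.intensity ?? 0;
  return (
    `Current dominant emotion: ${emotion} (intensity ${intensity.toFixed(2)}). ` +
    `Express this through tone and word choice; never name the emotion outright.`
  );
}
```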
Tips
- Use prompt_data.formatted by default. It is designed to produce the best LLM behavior. Only build custom prompts if you have specific requirements.
- Keep emotion injection subtle. Characters should not say "I feel joy at 0.72 intensity." The LLM should naturally express the emotion through tone and word choice.
- Handle session lifecycle. Create a new session for each conversation. Sessions persist state across turns automatically.
- Fetch full state sparingly. The turn endpoint returns enough data for most use cases. Only call GET /v1/state/{sessionId} when you need the full picture (body budget, interpersonal dynamics, etc.).
- For real-time apps, consider WebSocket connections instead of REST polling.