Prompt Templates
Prerequisites: Before reading this guide, we recommend familiarizing yourself with the system prompt configuration in Bot Management.
Monstrum’s prompt template system lets you customize the prompts a Bot uses in different scenarios, with support for workspace-level and Bot-level overrides.
Overview
When a Bot processes each message, the platform automatically assembles a complete system prompt. This prompt is not a single fixed block of text — it is composed of multiple layers of content stitched together. Prompt templates control the foundational layer: the Bot’s core behavioral instructions.
Prompts are resolved through a three-tier priority chain:
Bot-level override > Workspace-level template > System default
- If a prompt is set at the Bot level, the Bot’s version is used
- Otherwise, the workspace-level version is used
- If the workspace has not set one either, the system default is used
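The resolution chain above can be sketched in a few lines of Python. This is illustrative only: `resolve_prompt` and the dictionary names are hypothetical, not the platform’s actual API.

```python
# Hypothetical sketch of the three-tier prompt resolution chain.
SYSTEM_DEFAULTS = {
    "default_session_system": "You are {bot_name}, {bot_description}.",
}

def resolve_prompt(key, bot_overrides, workspace_templates):
    """Return the highest-priority template for a prompt key."""
    if key in bot_overrides:           # 1. Bot-level override wins
        return bot_overrides[key]
    if key in workspace_templates:     # 2. then the workspace-level template
        return workspace_templates[key]
    return SYSTEM_DEFAULTS[key]        # 3. otherwise the system default

# A Bot without an override falls back to the workspace template:
print(resolve_prompt(
    "default_session_system",
    bot_overrides={},
    workspace_templates={"default_session_system": "Reply in Chinese."},
))
# → Reply in Chinese.
```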
9 Prompt Keys
The platform manages 9 prompt templates covering all aspects of Bot operation:
Core Prompts
| Key | Purpose | Supported Variables |
|---|---|---|
| `default_task_system` | System prompt for task mode | {bot_name}, {bot_description} |
| `default_session_system` | System prompt for session mode | {bot_name}, {bot_description} |
These two are the most critical prompts, defining the Bot’s fundamental behavior in tasks and sessions.
Scenario Prompts
| Key | Purpose |
|---|---|
| `group_chat` | Additional prompt for group chats (injected only in session-mode group chat scenarios) |
| `planning` | Instructions for planning reasoning mode |
| `adaptive` | Instructions for adaptive reasoning mode |
Internal Prompts
| Key | Purpose | Supported Variables |
|---|---|---|
| `memory_extraction` | Memory extraction prompt | {current_section}, {conversation} |
| `memory_extraction_system` | System role for memory extraction | — |
| `conversation_summary` | Conversation compression prompt | {conversation_text} |
| `conversation_summary_system` | System role for conversation compression | — |
Internal prompts are used by the platform’s automated features (memory extraction, conversation compression) and typically do not need modification.
System Prompt Assembly Order
The final system prompt used by a Bot is composed of the following parts in order:
Session Mode
- Base template: The Bot’s custom system prompt (or the `default_session_system` template)
- Group chat instructions: The `group_chat` prompt (appended only in group chats)
- Available resources: `## Available Resources` — auto-generated summary of resources and tools
- Bot memories: `## Bot Memories` — memory content organized by scope
- Skill instructions: `## Skills` — content from enabled Skills
Task Mode
- Base template: The Bot’s custom system prompt (or the `default_task_system` template)
- Available resources: `## Available Resources`
- Bot memories: `## Bot Memories`
- Reasoning mode: `planning` or `adaptive` instructions (if a reasoning mode is configured)
- Skill instructions: `## Skills`
The “System Prompt” field in Bot settings replaces only the base template (the first layer above). The remaining layers are always appended automatically and are not affected.
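The session-mode layering can be sketched as follows. This is a minimal illustration: the function name and parameters are hypothetical, and the separator between sections is an assumption, not the platform’s documented behavior.

```python
# Hypothetical sketch of session-mode system prompt assembly,
# joining the layers in the documented order and skipping empty ones.
def assemble_session_prompt(base, group_chat=None, resources=None,
                            memories=None, skills=None):
    parts = [base]                                   # 1. base template
    if group_chat:
        parts.append(group_chat)                     # 2. group chat instructions
    if resources:
        parts.append("## Available Resources\n" + resources)  # 3.
    if memories:
        parts.append("## Bot Memories\n" + memories)          # 4.
    if skills:
        parts.append("## Skills\n" + skills)                  # 5.
    return "\n\n".join(parts)

prompt = assemble_session_prompt(
    "You are Helper, a support assistant.",
    group_chat="You are in a group chat.",
    memories="User prefers concise answers.",
)
print(prompt)
```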
Editing Workspace-Level Prompts
- Click Prompt Templates in the left navigation bar
- The page lists all 9 prompt keys
- Each key shows its current state: System Default or Customized
- Click the edit button to modify the prompt content
- Save
Modifying a workspace-level prompt affects all Bots that use the default value (those without a Bot-level override).
Reset to Default
Click the Reset to Default button to restore the workspace-level prompt to the system default.
Editing Bot-Level Prompts
- Navigate to the Bot detail page and select the Prompts tab
- The page lists all 9 prompt keys
- Each key shows its current source: System Default / Workspace / Bot Custom
- Click the edit button to modify the prompt content
- Save
Bot-level prompts have the highest priority. Once set, the Bot will no longer use the workspace or system default version.
Reset to Workspace Default
Click the Reset to Workspace Default button to clear the Bot-level override and fall back to the workspace-level or system default value.
Variable Support
Some prompt templates support variable substitution:
| Variable | Available In | Description |
|---|---|---|
| {bot_name} | Task/session system prompts | The Bot’s name |
| {bot_description} | Task/session system prompts | The Bot’s description |
| {current_section} | Memory extraction prompt | Current memory content |
| {conversation} | Memory extraction prompt | Conversation history |
| {conversation_text} | Conversation compression prompt | Conversation text to be compressed |
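The {name} placeholders in the table follow Python format-string syntax, so substitution behaves like the sketch below. This assumes simple format-style replacement; the platform’s exact templating engine is not specified here.

```python
# Hypothetical illustration of variable substitution in a prompt template.
template = "You are {bot_name}, {bot_description}."
prompt = template.format(bot_name="Support Bot",
                         bot_description="a helpdesk assistant")
print(prompt)  # → You are Support Bot, a helpdesk assistant.
```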
Common Use Cases
Standardizing Bot Style
Set default_session_system at the workspace level so all Bots use a consistent response style:
```
You are {bot_name}, {bot_description}.
Please reply in Chinese and maintain a professional, concise style. Avoid using emojis in your responses.
```
Improving Memory Extraction Quality
Customize the memory_extraction prompt to fine-tune the memory extraction rules:
```
Extract important information from the following conversation. Focus on:
1. Preferences explicitly expressed by the user
2. Technical decisions related to the project
3. Conventions that need to be remembered long-term

Current memories:
{current_section}

Conversation content:
{conversation}
```
Customizing Group Chat Behavior
Modify the group_chat prompt to control Bot behavior in group chats:
```
You are in a group chat. Please note:
- Only respond to messages relevant to you
- Do not repeat what others have already said
- Keep your responses brief
```
FAQ
Bot behavior hasn’t changed after modifying prompts
- Existing active sessions use the prompts from when they were created. Start a new conversation to use the updated prompts
- Check whether you modified the workspace-level or Bot-level prompt. If the Bot already has a custom prompt, modifying the workspace-level prompt will have no effect
Not sure which prompt is currently in use
In the Bot detail page under the Prompts tab, each key is labeled with its source (System Default / Workspace / Bot Custom).
Long prompts causing high token consumption
The longer the system prompt, the more tokens each LLM call consumes. Recommendations:
- Keep the base template concise
- Put detailed instructions in Skills and enable them on demand
- Periodically clean up unneeded memories (memories also consume prompt space)
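As a rough sanity check before saving a long template, token usage can be estimated from prompt length. The ~4-characters-per-token ratio below is a common heuristic for English text, not the platform’s actual tokenizer, so treat the result as an order-of-magnitude estimate only.

```python
# Back-of-envelope token estimate (heuristic: ~4 characters per token).
def estimate_tokens(prompt: str) -> int:
    return max(1, len(prompt) // 4)

long_template = "You are a helpful assistant. " * 50
print(estimate_tokens(long_template))
```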