feat: user-defined custom prompts for Moonbeam interactions #203
Conversation
Co-authored-by: sfreeman422 <16405652+sfreeman422@users.noreply.github.com> Agent-Logs-Url: https://github.com/dev-chat/mocker/sessions/914ee584-2937-48f5-b6b1-9827d6d84224
@copilot actually can you make this only apply to the participate code path and the prompt with history? The text path should stay with its current instructions
…cipate and promptWithHistory Co-authored-by: sfreeman422 <16405652+sfreeman422@users.noreply.github.com> Agent-Logs-Url: https://github.com/dev-chat/mocker/sessions/e5dacb98-90b9-4ae4-9d41-7b25200e02e8
Done in b6c8f8a. Custom prompts now only apply to `participate` and `promptWithHistory`.
Pull request overview
Adds support for per-user custom system prompts (“custom prompts”) that override Moonbeam’s default instructions for /prompt and @mention participation, with persistence in the SlackUser table and a new endpoint to manage the prompt.
Changes:
- Adds a nullable `SlackUser.customPrompt` column and a CRUD persistence service for custom prompts.
- Applies custom prompts in `AIService.promptWithHistory()` (prepend) and `AIService.participate()` (replace), and forwards the triggering Slack user ID into `participate()`.
- Adds a `/ai/set-prompt` endpoint (registered before suppression/text/ai middleware) with controller + tests.
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| packages/backend/src/shared/db/models/SlackUser.ts | Adds customPrompt column for per-user AI instructions. |
| packages/backend/src/ai/user-prompt.persistence.service.ts | Implements DB read/update/clear for customPrompt. |
| packages/backend/src/ai/user-prompt.persistence.service.spec.ts | Adds unit tests for custom prompt persistence behavior. |
| packages/backend/src/ai/ai.service.ts | Fetches and applies customPrompt in promptWithHistory and participate; passes event user ID into participate. |
| packages/backend/src/ai/ai.service.spec.ts | Adds tests asserting custom prompt is used/falls back correctly. |
| packages/backend/src/ai/ai.controller.ts | Adds /set-prompt management endpoint before existing middlewares. |
| packages/backend/src/ai/ai.controller.spec.ts | Adds endpoint tests for viewing/setting/clearing custom prompts. |
```ts
const { user_id, team_id, text } = req.body;

if (!text) {
  void userPromptPersistenceService.getCustomPrompt(user_id, team_id).then((prompt) => {
    if (prompt) {
      res.send(`Your current custom prompt: "${prompt}"`);
    } else {
      res.send(
        'You have no custom prompt set. Use `/set-prompt [prompt]` to set one, or `/set-prompt clear` to remove it.',
      );
    }
  });
  return;
}

if (text.toLowerCase() === 'clear') {
  void userPromptPersistenceService.clearCustomPrompt(user_id, team_id).then((success) => {
```
text is used without normalization. A whitespace-only value (e.g. " ") will be treated as a real prompt and saved, and clear with surrounding whitespace (" clear ") won’t match the clear branch. Consider trimming once (e.g. const normalized = (text ?? '').trim()), treating empty-after-trim as “show current prompt”, and using the normalized value for the clear check + save.
Fixed in d07dc43. text is now trimmed once up front ((text ?? '').trim()), whitespace-only values are rejected, and " CLEAR " correctly routes to the clear branch.
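The normalization described above can be sketched as a small classifier; the function and type names here are illustrative, not the exact handler code. It trims once up front, treats empty-after-trim input as "show current prompt", and uses the normalized value for the clear check and save:

```ts
// Illustrative sketch of the input normalization (names are assumptions).
type PromptAction =
  | { kind: 'show' }
  | { kind: 'clear' }
  | { kind: 'set'; prompt: string };

function classifyPromptInput(text: string | undefined): PromptAction {
  const normalized = (text ?? '').trim();
  if (normalized === '') {
    return { kind: 'show' }; // whitespace-only input is not a real prompt
  }
  if (normalized.toLowerCase() === 'clear') {
    return { kind: 'clear' }; // " CLEAR " also routes here
  }
  return { kind: 'set', prompt: normalized };
}
```

With this shape, a value like `"   "` shows the current prompt instead of being persisted, and `" CLEAR "` reaches the clear branch.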
```ts
void userPromptPersistenceService.setCustomPrompt(user_id, team_id, text).then((success) => {
  if (success) {
    res.send(`Your custom prompt has been set.`);
  } else {
    res.send('Failed to set your custom prompt. Please try again.');
  }
```
Because /set-prompt bypasses textMiddleware’s 800-char guard, users can persist very large prompts. That can cause DB bloat and downstream OpenAI request failures (token/size limits) when the prompt is later used as instructions. Consider enforcing an explicit maximum length for customPrompt here (with a user-friendly error message) before calling setCustomPrompt.
Fixed in d07dc43. Added an explicit 800-character limit check in the handler with a user-friendly error message before any persistence call.
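A minimal sketch of that length guard, assuming a hypothetical helper and constant name (the real handler does this check inline before calling `setCustomPrompt`):

```ts
// Illustrative sketch of the 800-character guard; names are assumptions.
const MAX_CUSTOM_PROMPT_LENGTH = 800;

function validatePromptLength(prompt: string): string | null {
  if (prompt.length > MAX_CUSTOM_PROMPT_LENGTH) {
    // Return a user-friendly error instead of persisting an oversized prompt.
    return `Your custom prompt is too long (${prompt.length} characters). Please keep it under ${MAX_CUSTOM_PROMPT_LENGTH} characters.`;
  }
  return null; // null means "ok to persist"
}
```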
```ts
const baseInstructions = customPrompt ?? MOONBEAM_SYSTEM_INSTRUCTIONS;
const systemInstructions = this.appendMemoryContext(baseInstructions, memoryContext);
```
customPrompt ?? MOONBEAM_SYSTEM_INSTRUCTIONS treats an empty string as a valid override, which can effectively wipe Moonbeam’s system instructions if a blank/whitespace prompt is stored. Consider normalizing via const normalized = customPrompt?.trim(); and only overriding when normalized is non-empty (otherwise fall back to MOONBEAM_SYSTEM_INSTRUCTIONS).
Fixed in d07dc43. Both promptWithHistory and participate now do customPrompt?.trim() || null so blank/whitespace-stored values fall back to the default system instructions.
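The fallback can be sketched as follows; the instruction text here is a placeholder, not Moonbeam's real prompt. The key point is that `?.trim() || null` maps `''`, `'   '`, `null`, and `undefined` all to `null`, so only a non-empty trimmed prompt overrides the default:

```ts
// Sketch of the blank-override fallback; the constant value is a placeholder.
const MOONBEAM_SYSTEM_INSTRUCTIONS = 'You are Moonbeam...';

function resolveInstructions(customPrompt: string | null | undefined): string {
  // '' || null === null, so whitespace-only stored values fall through.
  const normalized = customPrompt?.trim() || null;
  return normalized ?? MOONBEAM_SYSTEM_INSTRUCTIONS;
}
```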
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
```ts
import { UserPromptPersistenceService } from './user-prompt.persistence.service';

export const aiController: Router = express.Router();
aiController.use(suppressedMiddleware);
```
These should definitely still apply. Please add back.
Done in d07dc43. suppressedMiddleware, textMiddleware, and aiMiddleware are now registered before all routes, including /set-prompt.
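Why registration order matters: Express-style routers run `use`-registered middleware before any route declared after it. A dependency-free stand-in (not the real controller; `MiniRouter` and the handler names are illustrative) demonstrates the ordering guarantee:

```ts
// Tiny stand-in router: middleware registered via use() runs before routes.
type Handler = (calls: string[]) => void;

class MiniRouter {
  private middlewares: Handler[] = [];
  private routes = new Map<string, Handler>();

  use(mw: Handler): void {
    this.middlewares.push(mw);
  }

  post(path: string, handler: Handler): void {
    this.routes.set(path, handler);
  }

  dispatch(path: string): string[] {
    const calls: string[] = [];
    // Every registered middleware runs before the matched route handler,
    // which is why /set-prompt picks up the guards automatically.
    for (const mw of this.middlewares) mw(calls);
    this.routes.get(path)?.(calls);
    return calls;
  }
}

const router = new MiniRouter();
router.use((c) => c.push('suppressedMiddleware'));
router.use((c) => c.push('textMiddleware'));
router.use((c) => c.push('aiMiddleware'));
router.post('/set-prompt', (c) => c.push('setPromptHandler'));
```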
```ts
const webService = new WebService();
const aiService = new AIService();
const userPromptPersistenceService = new UserPromptPersistenceService();
```
I don't think we need a whole new service for this. Can these be moved into the user persistence service?
You are also breaking a pattern here by having the ai.controller hit a persistence service directly. The AIService should be leveraging any persistence services.
Done in d07dc43. getCustomPrompt, setCustomPrompt, and clearCustomPrompt are now methods on SlackPersistenceService. UserPromptPersistenceService has been deleted. The controller now calls aiService.setCustomPrompt/clearCustomPrompt — no persistence service is instantiated directly in the controller.
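The resulting layering can be sketched with simplified stand-ins (synchronous and in-memory here, unlike the real promise-based services): the controller talks only to `AIService`, which delegates to the persistence layer.

```ts
// Simplified sketch of the controller -> AIService -> persistence layering.
interface PromptStore {
  setCustomPrompt(userId: string, teamId: string, prompt: string): boolean;
  clearCustomPrompt(userId: string, teamId: string): boolean;
}

class SlackPersistenceServiceSketch implements PromptStore {
  private prompts = new Map<string, string>(); // stand-in for the SlackUser table

  setCustomPrompt(userId: string, teamId: string, prompt: string): boolean {
    this.prompts.set(`${teamId}:${userId}`, prompt);
    return true;
  }

  clearCustomPrompt(userId: string, teamId: string): boolean {
    return this.prompts.delete(`${teamId}:${userId}`);
  }
}

class AIServiceSketch {
  constructor(private readonly persistence: PromptStore = new SlackPersistenceServiceSketch()) {}

  // Thin pass-throughs: the controller never touches persistence directly.
  setCustomPrompt(userId: string, teamId: string, prompt: string): boolean {
    return this.persistence.setCustomPrompt(userId, teamId, prompt);
  }

  clearCustomPrompt(userId: string, teamId: string): boolean {
    return this.persistence.clearCustomPrompt(userId, teamId);
  }
}
```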
```ts
const userPromptPersistenceService = new UserPromptPersistenceService();
const aiLogger = logger.child({ module: 'AIController' });

// /set-prompt does not require rate-limiting or text-length validation
```
…hrough AIService, restore middleware order Co-authored-by: sfreeman422 <16405652+sfreeman422@users.noreply.github.com> Agent-Logs-Url: https://github.com/dev-chat/mocker/sessions/ef34bcb8-cd7e-474b-913e-339106bccdb1
Users can now define a custom system prompt that overrides Moonbeam's default instructions when interacting via `/prompt` and direct @mentions.

**Schema**
- Adds a nullable `customPrompt` text column to `SlackUser` (TypeORM sync applies automatically).

**Custom Prompt Persistence**
- CRUD methods (`getCustomPrompt`, `setCustomPrompt`, `clearCustomPrompt`) are added to `SlackPersistenceService`; no separate service is needed.
- `setCustomPrompt` trims and normalizes the prompt before persisting (whitespace-only values are stored as `null`).

**New: `/ai/set-prompt` endpoint**
- Subject to the standard `suppressedMiddleware` / `textMiddleware` / `aiMiddleware` stack.
- Prompts are trimmed on receipt; whitespace-only values and prompts exceeding 800 characters are rejected with user-friendly error messages.
- `clear` (case-insensitive, whitespace-tolerant) removes the stored prompt.

**`AIService` changes**
- Custom prompt CRUD is exposed via public `setCustomPrompt`/`clearCustomPrompt` methods on `AIService`, which delegate to `SlackPersistenceService`. The controller calls these service methods; no persistence service is instantiated directly in the controller.
- Custom prompts apply to the `participate` and `promptWithHistory` code paths only; `generateText` always uses `GENERAL_TEXT_INSTRUCTIONS`, unchanged. Both paths normalize the stored value (`customPrompt?.trim() || null`) before use to prevent blank overrides of the default system instructions.

| Method | Instructions |
|---|---|
| `generateText()` | `GENERAL_TEXT_INSTRUCTIONS` (unaffected) |
| `promptWithHistory()` | custom prompt, falling back to `MOONBEAM_SYSTEM_INSTRUCTIONS` when none set |
| `participate(teamId, channelId, msg, userId?)` | custom prompt, falling back to `MOONBEAM_SYSTEM_INSTRUCTIONS` when none set |

- `handle()` forwards `request.event.user` to `participate()` so tagging lookups resolve to the correct user's prompt.
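The `handle()` forwarding can be sketched as follows; the event shape and class names here are assumptions for illustration, not the real types:

```ts
// Illustrative sketch: handle() passes the triggering Slack user ID into
// participate() so the custom-prompt lookup resolves to the tagging user.
interface MentionEvent {
  team: string;
  channel: string;
  text: string;
  user: string; // the Slack user who tagged the bot
}

class ParticipateSketch {
  participate(teamId: string, channelId: string, msg: string, userId?: string): string {
    // In the real service, userId drives the customPrompt lookup;
    // without it, the default instructions are used.
    return userId ? `prompt-for:${userId}` : 'default-instructions';
  }
}

function handle(event: MentionEvent, ai: ParticipateSketch): string {
  return ai.participate(event.team, event.channel, event.text, event.user);
}
```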