feat(companion): move chat onto runPlannerLoop with native function calling

The companion chat had its own ad-hoc 3-round tool-calling pipeline:
build a system prompt with tool descriptions, ask the LLM to emit
```tool JSON blocks, regex-extract, execute, feed back the result as
a synthetic user message. Same fragility class as the old text-JSON
planner, and now unnecessary since mana-llm speaks native function
calling.

Migrates companion/engine.ts to the shared runPlannerLoop, same as
the mission runner (commit 5a) and the server tick (commit 6). Tools
go to the LLM as proper function schemas; tool_calls come back
structured; the executor runs them directly under USER_ACTOR.
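A minimal sketch of what the migration buys: tool calls arrive as structured objects and are dispatched directly to an executor, with no regex extraction and no synthetic user message. All names here (`executors`, `get_time`, the `ToolCall` shape) are illustrative assumptions, not the repo's actual API.

```typescript
// Hypothetical sketch: structured tool_call dispatch instead of
// regex-extracting ```tool JSON blocks from free-form LLM text.

interface ToolCall {
  readonly name: string;
  readonly arguments: string; // JSON-encoded args, as the LLM returns them
}

const USER_ACTOR = 'user' as const;

// Executors keyed by tool name; each runs under a fixed actor.
// The tools themselves are made up for this example.
const executors: Record<string, (args: unknown, actor: string) => unknown> = {
  get_time: (_args, actor) => ({ actor, now: new Date().toISOString() }),
};

function executeToolCall(call: ToolCall): unknown {
  const run = executors[call.name];
  if (!run) throw new Error(`unknown tool: ${call.name}`);
  // Arguments arrive as JSON; no prompt-format parsing involved.
  return run(JSON.parse(call.arguments), USER_ACTOR);
}

const result = executeToolCall({ name: 'get_time', arguments: '{}' });
console.log(result);
```

The key design point is that failure modes move from "the model formatted its JSON block wrong" to ordinary typed errors at the dispatch boundary.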

Extends shared-ai/planner/loop.ts with an optional priorMessages[]
input field so the chat can preserve multi-turn history between
turns (missions don't need this and leave it empty).

Deletes the old llm-tasks/companion-chat.ts LlmTask wrapper. Nothing
else imported it.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Till JS 2026-04-20 16:45:33 +02:00
parent 80dbb3b3b6
commit 9f7d2f24b3
3 changed files with 105 additions and 272 deletions


@@ -63,6 +63,10 @@ export interface LlmClient {
export interface PlannerLoopInput {
readonly systemPrompt: string;
readonly userPrompt: string;
/** Optional prior conversation turns inserted between the system
* prompt and the new user turn. Used by the companion chat to
* preserve multi-turn history; missions leave this empty. */
readonly priorMessages?: readonly ChatMessage[];
readonly tools: readonly ToolSchema[];
readonly model: string;
readonly temperature?: number;
@@ -113,6 +117,7 @@ export async function runPlannerLoop(opts: {
const messages: ChatMessage[] = [
{ role: 'system', content: input.systemPrompt },
...(input.priorMessages ?? []),
{ role: 'user', content: input.userPrompt },
];
const executedCalls: ExecutedCall[] = [];
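The message assembly in the hunk above can be sketched standalone: prior turns slot between the system prompt and the new user turn, and an empty `priorMessages` degenerates to the original two-message mission shape. `ChatMessage` is reduced to a minimal interface here; the real one lives in shared-ai.

```typescript
// Sketch of the priorMessages wiring, assuming a minimal ChatMessage.
interface ChatMessage {
  readonly role: 'system' | 'user' | 'assistant';
  readonly content: string;
}

function buildMessages(
  systemPrompt: string,
  userPrompt: string,
  priorMessages?: readonly ChatMessage[],
): ChatMessage[] {
  return [
    { role: 'system', content: systemPrompt },
    ...(priorMessages ?? []), // empty for missions
    { role: 'user', content: userPrompt },
  ];
}

// Mission-style call: no history, just system + user.
console.log(buildMessages('plan', 'do X').length); // 2

// Chat-style call: two prior turns preserved between system and user.
const history: ChatMessage[] = [
  { role: 'user', content: 'hi' },
  { role: 'assistant', content: 'hello!' },
];
console.log(buildMessages('companion persona', 'how are you?', history).length); // 4
```

Keeping the field optional and readonly means missions need no change at all, and callers cannot mutate the history the loop was handed.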