

Feynman stores all configuration and state under ~/.feynman/. This directory is created on first run and contains settings, authentication tokens, session history, and installed packages.

Directory structure

```
~/.feynman/
├── settings.json       # Core configuration: model, thinking level, packages
├── auth.json           # OAuth tokens and API keys
├── sessions/           # Persisted conversation history
└── packages/           # Installed optional packages
```
The settings.json file is the primary configuration file. It is created and maintained by feynman setup and feynman model set, and can be edited manually. Do not edit it while Feynman is running.

settings.json

A typical settings.json looks like:
```json
{
  "defaultProvider": "anthropic",
  "defaultModel": "claude-sonnet-4-5",
  "defaultThinkingLevel": "medium",
  "packages": [
    "npm:pi-subagents",
    "npm:pi-btw",
    "npm:pi-docparser",
    "npm:pi-web-access",
    "npm:pi-markdown-preview",
    "npm:@walterra/pi-charts",
    "npm:pi-mermaid",
    "npm:@aliou/pi-processes",
    "npm:pi-zotero",
    "npm:@kaiserlich-dev/pi-session-search",
    "npm:pi-schedule-prompt",
    "npm:@samfp/pi-memory",
    "npm:@tmustier/pi-ralph-wiggum"
  ]
}
```

Fields

| Field | Type | Description |
| --- | --- | --- |
| defaultProvider | string | The model provider to use by default (e.g., anthropic, openai) |
| defaultModel | string | The model ID within the provider (e.g., claude-sonnet-4-5) |
| defaultThinkingLevel | string | Reasoning depth: off, minimal, low, medium, high, xhigh |
| packages | string[] | List of Pi package sources. Managed by feynman packages commands |
The defaultProvider and defaultModel fields are set together. Use feynman model set <provider/model> to change them — this is the recommended way to update your model.
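Because settings.json is plain JSON, you can also inspect the active model pair from the command line. A minimal sketch, assuming jq is installed; it writes a sample file mirroring the structure above, so point the path at ~/.feynman/settings.json in real use:

```shell
# Sample file mirroring settings.json (use ~/.feynman/settings.json in practice).
SETTINGS=$(mktemp)
cat > "$SETTINGS" <<'EOF'
{ "defaultProvider": "anthropic", "defaultModel": "claude-sonnet-4-5" }
EOF

# Validate the JSON, then print the active model as provider/model.
jq empty "$SETTINGS" && jq -r '.defaultProvider + "/" + .defaultModel' "$SETTINGS"
```

`jq empty` exits non-zero on malformed JSON, which makes it a quick sanity check after a manual edit.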

Model configuration

The defaultProvider and defaultModel fields together select which model Feynman uses when you launch without the --model flag. The format for feynman model set uses a / or : separator:
```shell
feynman model set anthropic/claude-opus-4-5
feynman model set openai/gpt-5
feynman model set google/gemini-2.5-pro
```
To see all models you have configured and their authentication status:
```shell
feynman model list
```
Feynman ranks models by research suitability. In order of preference:
| Model spec | Reason |
| --- | --- |
| anthropic/claude-opus-4-6 | Strong long-context reasoning for source-heavy research work |
| anthropic/claude-opus-4-5 | Strong long-context reasoning for source-heavy research work |
| anthropic/claude-sonnet-4-6 | Balanced reasoning and speed for iterative research sessions |
| anthropic/claude-sonnet-4-5 | Balanced reasoning and speed for iterative research sessions |
| openai/gpt-5.4 | Strong general reasoning and drafting quality for research tasks |
| openai/gpt-5 | Strong general reasoning and drafting quality for research tasks |
| google/gemini-2.5-pro | Good fallback for broad web-and-doc research work |

Thinking levels

The defaultThinkingLevel field controls how much extended reasoning the model applies before responding. This can be overridden per-session with --thinking or FEYNMAN_THINKING.
| Level | Description |
| --- | --- |
| off | No extended thinking |
| minimal | Minimal reasoning pass |
| low | Light thinking |
| medium | Default: balanced reasoning and speed |
| high | Deep reasoning for complex research tasks |
| xhigh | Maximum reasoning budget |
```shell
feynman --thinking high
```

Environment variables

Environment variables take precedence over settings.json. Set them in your shell profile or in a .env file in your working directory.

Feynman runtime

| Variable | Description | Default |
| --- | --- | --- |
| FEYNMAN_MODEL | Override the default model (e.g., anthropic/claude-sonnet-4-5) | |
| FEYNMAN_THINKING | Override the thinking level | medium |
| FEYNMAN_HOME | Override the config directory | ~/.feynman |
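For example, a shell-profile snippet that pins the model and thinking level for every session (a sketch; the model spec is one of the examples from this page):

```shell
# Pin defaults for every Feynman session via the environment.
export FEYNMAN_MODEL=anthropic/claude-sonnet-4-5
export FEYNMAN_THINKING=high

# Or override for a single invocation without touching the profile:
#   FEYNMAN_MODEL=openai/gpt-5 feynman
```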

Model provider API keys

| Variable | Provider |
| --- | --- |
| ANTHROPIC_API_KEY | Anthropic (Claude) |
| OPENAI_API_KEY | OpenAI (GPT) |
| GEMINI_API_KEY | Google Gemini |
| OPENROUTER_API_KEY | OpenRouter |
| ZAI_API_KEY | Z.AI / GLM |
| KIMI_API_KEY | Kimi / Moonshot |
| MINIMAX_API_KEY | MiniMax |
| MINIMAX_CN_API_KEY | MiniMax (China region) |
| MISTRAL_API_KEY | Mistral |
| GROQ_API_KEY | Groq |
| XAI_API_KEY | xAI (Grok) |
| CEREBRAS_API_KEY | Cerebras |
| HF_TOKEN | Hugging Face |
| OPENCODE_API_KEY | OpenCode |
| AI_GATEWAY_API_KEY | Vercel AI Gateway |
| AZURE_OPENAI_API_KEY | Azure OpenAI Responses |
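Keys can be exported in your shell profile or placed in a .env file in the working directory. A sketch with a placeholder value (substitute your real key):

```shell
# Placeholder value for illustration only; use your real key.
export ANTHROPIC_API_KEY="sk-ant-placeholder"

# Confirm the variable is visible to child processes such as feynman.
env | grep -q '^ANTHROPIC_API_KEY=' && echo "key is set"
```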

Compute providers

| Variable | Description |
| --- | --- |
| RUNPOD_API_KEY | RunPod GPU pod provisioning (used by /replicate and /autoresearch) |
| MODAL_TOKEN_ID | Modal serverless GPU: token ID |
| MODAL_TOKEN_SECRET | Modal serverless GPU: token secret |
API keys are read by the Pi runtime. You can also configure them via feynman model login for OAuth-based providers, or place them in ~/.feynman/auth.json for key-based providers. The auth.json file is managed by the Pi runtime.

Session management

Each conversation is persisted as a file in ~/.feynman/sessions/. Sessions allow you to resume previous conversations and maintain context across REPL restarts. To start a fresh session:
```shell
feynman --new-session
```
To store sessions in a different directory (useful for per-project isolation):
```shell
feynman --session-dir ~/projects/myproject/.sessions
```
The --session-dir flag is particularly useful when you want separate conversation histories for different research projects.
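One way to wire this up is a small wrapper function in your shell profile. A sketch; fey is a hypothetical alias name, and it assumes feynman is on your PATH:

```shell
# Hypothetical wrapper: each project gets its own session history
# under <project>/.sessions, keyed off the current directory.
fey() {
  feynman --session-dir "$PWD/.sessions" "$@"
}
```

Run fey from any project root and that project's conversations stay isolated from the global ~/.feynman/sessions/ history.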

Diagnostics

Run feynman doctor to verify your configuration, check authentication status for all configured providers, and detect missing optional dependencies:
```shell
feynman doctor
```
The doctor command outputs a checklist showing what is working and what needs attention — including missing API keys, unauthenticated providers, missing preview dependencies, and Pi runtime issues.
```shell
feynman status
```
feynman status shows a compact summary of your current setup: active model, auth status, installed packages, and session directory.