FAQ
What is yaku?
yaku (訳, Japanese for “translation”) is a CLI tool that translates text, files, and developer artifacts from your terminal. It reads from stdin, files, or arguments and writes translated text to stdout — like jq or sed, but for natural language.
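The stdin/file/argument behavior above looks like any other Unix filter in practice. A short sketch (the `--to` and `-f` flags are taken from examples later in this FAQ; the file names are illustrative):

```sh
# Translate an argument
yaku --to ja "Build succeeded"

# Translate stdin, Unix-filter style
git log -1 --format=%s | yaku --to en

# Translate a file and write the result to stdout
yaku --to en -f README.ja.md > README.md
```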
How is yaku different from other translation tools?
See Why yaku? for a detailed comparison with Google Translate, general-purpose AI CLIs (llm, sgpt, aichat), and Translate Shell.
Is yaku free?
The CLI is open source (MIT license) and free to use with your own API key. The hosted service at api.yakulang.com works with no setup and includes a free tier. See Hosted Service & Plans for tier details and quota limits.
Which LLMs does yaku support?
Four backends:
| Backend | Default model | How to use |
|---|---|---|
| Hosted (api.yakulang.com) | Server-side | Default — works with no setup |
| Google Gemini | gemini-2.5-flash | yaku config set backend gemini |
| OpenAI | gpt-4o-mini | yaku config set backend openai |
| Anthropic | claude-haiku-4-5-20251001 | yaku config set backend anthropic |
The OpenAI backend also works with any OpenAI-compatible API (Groq, Together.ai, DeepSeek, Ollama). See Backends.
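As a sketch of what using a compatible provider looks like: only the base URL and model name change relative to the stock OpenAI backend. The Groq endpoint and model ID below are assumptions about that provider, and how you supply the API key depends on your yaku configuration:

```sh
# Point the OpenAI backend at an OpenAI-compatible provider (Groq shown;
# endpoint and model ID are this example's assumptions)
yaku --backend openai \
  --api-base https://api.groq.com/openai/v1 \
  --model llama-3.1-8b-instant \
  --to ja "Deploy complete"
```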
Which languages does yaku support?
Any language your chosen LLM backend can translate. yaku does not maintain a hardcoded language list — it passes the language code directly to the model. See Languages.
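Because the code is passed straight through, regional variants and less common languages work as long as the model knows them. The codes below are illustrative:

```sh
yaku --to pt-BR "The build failed"   # Brazilian Portuguese
yaku --to gd "Good morning"          # Scottish Gaelic
```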
Does yaku store my text?
Local backends (Gemini, OpenAI, Anthropic): yaku sends your text directly to the LLM API and returns the result. Nothing is stored by yaku. Review your LLM provider’s privacy policy for their data handling.
Hosted backend: your text is sent to api.yakulang.com for translation. The hosted service does not persist your text after the response is returned.
Can I use yaku offline?
Yes, with a local model via the OpenAI-compatible backend. For example, with Ollama:

```sh
yaku --backend openai \
  --api-base http://localhost:11434/v1 \
  --model llama3 \
  --to en "Bonjour"
```

See Backends — OpenAI-compatible providers.
How do I ensure consistent terminology?
Use a glossary file. Create .yaku-glossary.yaml in your project root:
```yaml
zh-TW:
  Kubernetes: ~    # keep in English
  container: 容器   # always translate to this
```

yaku auto-loads it and injects the terms into every translation prompt.
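With that file in the project root, an ordinary invocation picks the glossary up automatically. The `-f` flag appears elsewhere in this FAQ; the file paths here are illustrative:

```sh
# No extra flag needed; .yaku-glossary.yaml is detected from the project root
yaku --to zh-TW -f docs/overview.md > docs/overview.zh-TW.md
```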
Can I use my own LLM prompt?
Yes, with local backends. The --prompt flag replaces yaku’s built-in system prompt entirely:
```sh
yaku --to en --prompt my-prompt.txt -f docs.ja.md
```

Custom prompts only work with local backends (Gemini, OpenAI, Anthropic). The hosted backend uses server-side prompts and ignores --prompt. See Custom Prompts.
How do I report a bug?
Open an issue with your yaku version (yaku version), the command you ran, and the error message.