Human-in-the-loop design iteration cycle driven by claude.ai/design via the `designer` MCP. The human is the designer; the AI is translation plus plumbing. The human states intent, the AI reads what exists and relays that intent to Claude Design, the human applies taste to the variants Claude produces, and the AI interprets the reactions and iterates. Promotes the accepted result with a decision record (the bundle's chat transcript).
AnyCap CLI -- create media humans can see and hear (generate/edit images, produce video, compose music), understand media humans share (analyze images, video, audio), access the web (search, crawl), and deliver results humans can use (Drive for shareable file links, Page for hosted web pages). Use whenever a task involves creating visual or audio content, analyzing media, searching or reading the web, sharing files with humans, or publishing anything as a web page -- even if the user doesn't mention AnyCap by name. Also use for AnyCap authentication (login, API key, credentials), configuration, and feedback. Trigger on: image/video/music generation, media analysis, web search, web crawl, file sharing, page hosting, drive storage, delivering results to users, or any mention of AnyCap.
- 📁 references/
- 📄 AGENTS.md
- 📄 GEMINI.md
- 📄 SKILL.md
Write text that avoids all known AI writing patterns and passes human detection. Use when writing articles, essays, blog posts, social media copy, or any content that must read as authentically human. Triggers on requests to write naturally, avoid AI slop, avoid AI detection, humanize writing, write like a human, or make text sound authentic.