A goal-directed design process for AI coding agents. Derives screens from user goals — not the other way around.
Based on Alan Cooper's Goal-Directed Design, this skill fills the gap that every technical spec skill ignores: understanding the human problem deeply before writing a single line of code.
Guides you through a structured design thinking process:
- Competitive Teardown — What exists, what's bloated, what's missing
- User Goals — Life goals, experience goals, end goals (not features)
- Context Scenarios — Real narratives that reveal requirements
- Requirements Extraction — Discovered from scenarios, not brainstormed
- Interaction Framework — Modes the user is in, not screens
- Screen Derivation — Screens emerge from modes + scenarios
- Design Principles & Experience — What it should feel like
Includes an autonomous interview protocol for gathering requirements through real conversations with target users. It builds a goal tree from the person's role, adapts questions to their responses, and tracks coverage of the tree.
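The coverage-tracking idea can be sketched in a few lines. This is a minimal illustration, not the skill's actual internals: the `GoalNode` structure and the example goals are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class GoalNode:
    """One node in an interview goal tree (illustrative structure)."""
    name: str
    covered: bool = False
    children: list["GoalNode"] = field(default_factory=list)

    def mark(self, name: str) -> None:
        # Mark a goal as covered once an interview answer addresses it.
        if self.name == name:
            self.covered = True
        for child in self.children:
            child.mark(name)

    def coverage(self) -> float:
        # Fraction of leaf goals covered so far; drives which
        # question the interviewer asks next.
        leaves = self._leaves()
        return sum(leaf.covered for leaf in leaves) / len(leaves)

    def _leaves(self) -> list["GoalNode"]:
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c._leaves()]

# Hypothetical goal tree for one interviewee
tree = GoalNode("root", children=[
    GoalNode("experience goals", children=[GoalNode("feel in control")]),
    GoalNode("end goals", children=[GoalNode("finish report by 5pm"),
                                    GoalNode("avoid rework")]),
])
tree.mark("feel in control")
print(f"{tree.coverage():.0%}")  # 33% — one of three leaf goals covered
```

An agent running the interview would call `mark()` as answers come in and keep asking until `coverage()` crosses a threshold.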
Install:

    npx skills add storypixel/design-discovery-skill

Every AI coding skill jumps straight to PRDs, wireframes, and architecture. None of them ask "who is this for, and what do they actually need?" This skill is the thinking that should happen before all of that.
A therapy app that doesn't understand a 2am anxiety spike will fail no matter how clean the SwiftUI code is.
Generates a DESIGN_DISCOVERY.md that feeds into downstream skills:
- PRD generators
- Architecture specs
- UX specifications
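A sketch of what the generated file might look like, assuming its sections mirror the process steps above (the headings and placeholder text are illustrative, not the skill's exact output):

```markdown
# Design Discovery: <project name>

## Competitive Teardown
What exists, what's bloated, what's missing.

## User Goals
- Life goals: ...
- Experience goals: ...
- End goals: ...

## Context Scenarios
Scenario 1: a short narrative of real use.

## Requirements
Extracted from the scenarios above, not brainstormed.

## Interaction Framework
Modes the user is in, not screens.

## Screens
Derived from modes + scenarios.

## Design Principles & Experience
What it should feel like.
```

Downstream skills (PRD generators, architecture specs, UX specifications) would consume these sections as structured input.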