How Taste Works
Capture what you love. AI extracts the design language underneath and synthesizes it into a live taste profile that your coding tools read on every prompt.
Multiple Signals, One Understanding
Taste doesn’t rely on a single method. It triangulates your preferences from several types of input.
What AI Sees in Every Screenshot
Every screenshot is analyzed across multiple layers of design language. Not just colors — the full picture.
- Color Palette: roles and relationships
- Typography: scale, weight, and hierarchy
- Spacing: density and rhythm
- Composition: layout and structure
- Shape Language: corners and forms
- Depth & Shadow: elevation and layering
- Platform: iOS, web, desktop, or mobile
- Design Identity: visual mood and feel
Powered by multimodal AI that understands interfaces the way a designer does — not pixel by pixel, but holistically.
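To make the layers above concrete, here is one way a single screenshot's analysis could be structured. Every field name below is hypothetical — it mirrors the layers listed above, not a real Taste API.

```python
from dataclasses import dataclass

@dataclass
class ScreenshotAnalysis:
    """Hypothetical shape of one screenshot's design-language read.

    Field names are illustrative only; they follow the layers listed
    above (color, typography, spacing, composition, shape, depth,
    platform, identity), not the product's actual schema.
    """
    color_palette: dict   # color roles and relationships
    typography: dict      # scale, weight, hierarchy
    spacing: dict         # density and rhythm
    composition: str      # layout and structure
    shape_language: str   # corners and forms
    depth: str            # elevation and layering
    platform: str         # "ios" | "web" | "desktop" | "mobile"
    identity: str         # visual mood and feel

sample = ScreenshotAnalysis(
    color_palette={"primary": "#0F172A", "accent": "#38BDF8"},
    typography={"scale": "modular", "weight_range": [400, 700]},
    spacing={"density": "airy", "base_unit_px": 8},
    composition="single-column, centered hero",
    shape_language="large radii, soft forms",
    depth="flat with subtle layered shadows",
    platform="web",
    identity="calm, confident, minimal",
)
```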
Taste in Six Dimensions
Your design preferences are mapped across six perceptual axes. Each one captures a distinct aspect of visual style that your AI tools use to calibrate their output.
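As an illustration of what "six perceptual axes" could mean in practice, here is a toy profile. The axis names and scores are invented for the example — the source does not name the real six dimensions.

```python
# Hypothetical axis names and scores -- illustrative only.
# Each axis runs from 0.0 (first pole) to 1.0 (second pole).
PROFILE = {
    "minimal_vs_maximal":   0.2,
    "flat_vs_dimensional":  0.35,
    "geometric_vs_organic": 0.5,
    "muted_vs_vivid":       0.7,
    "dense_vs_airy":        0.8,
    "serious_vs_playful":   0.4,
}

def describe(profile):
    """Name the leaning on each axis so a tool can read it at a glance."""
    out = {}
    for axis, score in profile.items():
        low, high = axis.split("_vs_")
        if score < 0.5:
            out[axis] = low
        elif score > 0.5:
            out[axis] = high
        else:
            out[axis] = "balanced"
    return out

print(describe(PROFILE))
```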
A System That Learns
Taste isn’t a one-time export. It’s a continuous loop that gets sharper the more you use it.
Every screenshot you capture, every swipe you make, and every design you save refines your taste profile in real time. Your AI tools get smarter about your preferences with every interaction — not stuck on day one.
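One simple way such a loop could work is an exponential moving average over axis scores — recent captures shift the profile, but no single screenshot overwrites accumulated taste. This is purely a sketch, not Taste's actual algorithm.

```python
def refine(profile, observation, weight=0.2):
    """Fold one new capture into the live profile.

    Exponential moving average: each axis moves a fraction `weight`
    of the way toward the new observation. Illustrative only.
    """
    return {axis: (1 - weight) * profile[axis] + weight * observation[axis]
            for axis in profile}

profile = {"muted_vs_vivid": 0.5, "dense_vs_airy": 0.5}
capture = {"muted_vs_vivid": 0.9, "dense_vs_airy": 0.6}
profile = refine(profile, capture)
# muted_vs_vivid -> 0.58, dense_vs_airy -> 0.52
```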
What You Don’t Like Matters Too
Most personalization systems learn only from positive signals. Taste is different: the designs you reject are captured as anti-preferences, teaching your AI what to avoid as clearly as what to pursue.
Both signals feed into your taste profile. The result is AI output that feels like yours, not a generic default.
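A minimal sketch of how rejections could become explicit anti-preferences — here, design tags that are rejected more often than accepted. The tags and tallying rule are invented for illustration.

```python
from collections import Counter

def collect_anti_preferences(events):
    """Derive 'avoid' patterns from accept/reject feedback.

    Toy sketch: each event is (tags, liked). Tags that net out
    negative across events become anti-preferences.
    """
    score = Counter()
    for tags, liked in events:
        for tag in tags:
            score[tag] += 1 if liked else -1
    return sorted(tag for tag, s in score.items() if s < 0)

events = [
    ({"gradient-heavy", "dense"}, False),  # swiped away
    ({"airy", "muted"}, True),             # saved
    ({"gradient-heavy"}, False),           # swiped away
    ({"dense"}, True),                     # saved
]
print(collect_anti_preferences(events))  # ['gradient-heavy']
```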
Your Taste Adapts to What You’re Building
What looks right for a navigation bar is different from what works on a landing page. As your library grows, Taste builds specialized guidance for each type of interface you care about.
When you’ve captured enough examples in a category, Taste generates guidance specific to that surface type. Your AI gets context-specific direction, not one-size-fits-all advice.
A Skill File Your AI Actually Reads
Taste generates a live skill file that auto-syncs to your AI coding tools. Every prompt starts with your taste baked in.
What’s in your skill file
- Your synthesized taste profile — a qualitative summary of your design sensibility
- Anti-preferences — explicit patterns your AI should avoid
- Per-category guidance — targeted direction for navigation, dashboards, landing pages, and more
- Reference samples — your best screenshots with observed design characteristics
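A minimal sketch of how the four sections above might be laid out in such a file. The headings follow the list above; all wording, filenames, and guidance text are hypothetical.

```markdown
# Taste Profile (hypothetical layout)

## Synthesized taste
Leans minimal and airy; muted palettes with one vivid accent;
flat surfaces with subtle layered shadows.

## Anti-preferences
- Avoid gradient-heavy backgrounds
- Avoid dense, cramped spacing

## Per-category guidance
### Navigation
Prefer a compact top bar; reserve sidebars for dashboards.
### Landing pages
Single-column hero, generous whitespace, one accent color.

## Reference samples
- capture-014.png: calm dashboard, 8px spacing rhythm
- capture-027.png: editorial landing page, large serif headlines
```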
Works Everywhere You Do
Capture on your Mac. Browse on the web. Upload from your phone. Everything stays in sync.
Relevant Research
Taste is informed by frontier research in preference learning, multimodal AI, and computational aesthetics.
DesignPref: Capturing Personal Preferences in Visual Design Generation
Peng, Bigham & Wu · CMU & Apple · 2025
Personalized models outperform aggregated ones using 20× fewer examples
ViPer: Visual Personalization of Generative Models via Individual Preference Learning
Ostashev et al. · EPFL · ECCV 2024
Structured visual preference extraction from user feedback
Direct Preference Optimization: Your Language Model is Secretly a Reward Model
Rafailov et al. · Stanford · NeurIPS 2023
Outstanding Paper Runner-up — aligning AI directly from preference data
Ferret-UI 2: Mastering Universal User Interface Understanding Across Platforms
Li et al. · Apple · ICLR 2025
Multimodal UI understanding across iPhone, Android, iPad, web, and desktop
GPT-4 Technical Report
OpenAI · 2023
The multimodal foundation for vision-based design understanding
Where To Next? A Dynamic Model of User Preferences
Sanna Passino et al. · Spotify & Imperial College · 2024
Preference evolution modeling — taste isn’t static, tracking its trajectory matters
Ready?
Ready to teach your AI what good design looks like?
Sign up for free and start building your design taste profile.
Mac, Web & iOS