How We Rank Design Studios
Complete methodology, scoring criteria, and editorial standards for design studio rankings.
Ranking Philosophy
Our design studio rankings exist to help businesses and organizations find the right creative partner for their specific needs. We evaluate studios across multiple dimensions because "best" depends on context—a studio that excels at brand identity for Fortune 500 companies may not be the right fit for a startup's product design needs.
Our approach balances:
- Objective criteria (portfolio analysis, awards, team credentials)
- Verified outcomes (client results, project success, business impact)
- Process quality (communication, collaboration, transparency)
- Consistency (sustained excellence vs. one-hit wonders)
Scoring Criteria & Weights
Each studio receives a score from 0-10 across six categories. These scores combine into an overall rating using the weights below; a worked example of the weighting follows the category descriptions.
Portfolio Quality (25%)
Highest weight. What we evaluate: Originality, craft excellence, strategic thinking, execution quality, and work diversity across 5-10 projects per studio.
How we score: Three independent editors review portfolio work blind (studio names hidden initially), scoring projects on a 10-point rubric covering concept, visual craft, typography, systems thinking, and appropriate execution.
Sources: Public portfolios, case studies, award submissions, and press features.
Client Results (20%)
Second-highest weight. What we evaluate: Measurable business outcomes, client growth, market performance, brand recognition, and award wins (client-side awards like Effies, not just design awards).
How we score: Analysis of public case studies for outcome data, tracking of client businesses post-launch, award databases, and when possible, direct client interviews (conducted anonymously).
Sources: Studio case studies, client press releases, award databases (Effie, Cannes Lions business results categories), financial news for public companies.
Industry Recognition (15%)
What we evaluate: Awards from respected institutions (D&AD, ADC, Webby, Red Dot, Fast Company Innovation), press coverage quality, speaking engagements, and thought leadership.
How we score: Award tracking across past 5 years, weighted by award prestige (D&AD Pencils > regional design awards). Press mentions analyzed for quality (Wired, Fast Company > generic listicles).
Sources: Award databases, design publication archives, conference speaker lists.
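As an illustration of the prestige weighting, here is a minimal sketch that tallies awards from the past five years using hypothetical tier weights; the specific names and numbers are examples only, not our actual rubric:

```python
# Illustrative sketch of prestige-weighted award counting over the past
# 5 years. The tier weights below are hypothetical examples, not our rubric.

PRESTIGE = {
    "D&AD Pencil": 3.0,
    "ADC": 2.5,
    "Webby": 2.0,
    "Red Dot": 2.0,
    "regional": 1.0,
}

def recognition_signal(awards: list[tuple[str, int]], current_year: int = 2025) -> float:
    """Sum prestige weights for awards won within the past five years."""
    return sum(
        PRESTIGE.get(name, 1.0)
        for name, year in awards
        if current_year - year < 5
    )

# Example: two D&AD Pencils plus one regional award outweigh a handful
# of regional awards alone.
print(recognition_signal([("D&AD Pencil", 2024), ("D&AD Pencil", 2022), ("regional", 2023)]))  # 7.0
```

Editors then translate this kind of signal, alongside press quality, into the 0-10 category score.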
Process & Collaboration (15%)
What we evaluate: Client testimonials, communication quality, project management transparency, responsiveness, and ability to meet deadlines.
How we score: Analysis of public testimonials, case study descriptions of process, Clutch and similar review platforms (weighted skeptically), and reputation within the design community.
Sources: Client testimonials on studio sites, review platforms, designer community discussions (carefully vetted).
Team Expertise (15%)
What we evaluate: Partner/leadership credentials, team structure, specialist capabilities, cross-disciplinary depth, and staff retention.
How we score: Review of leadership backgrounds, team page analysis, LinkedIn data (carefully, to avoid privacy issues), and specialist presence (e.g., motion designers, brand strategists, researchers).
Sources: Studio websites, LinkedIn (public profiles only), conference speaker bios, design publication author pages.
Innovation (10%)
What we evaluate: Use of emerging techniques, experimental work, contributions to design discourse, and pushing category boundaries.
How we score: Assessment of whether studios advance the field vs. execute established patterns, experimental project presence, tool/technique innovation, and published thinking.
Sources: Portfolio review, design publication features, tool/technique case studies.
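To make the weighting concrete, here is a minimal sketch of how the six category scores could combine into an overall rating, assuming a simple weighted average; the function and dictionary names are illustrative, not part of our internal tooling:

```python
# Illustrative sketch: combining six category scores (each 0-10) into an
# overall rating using the weights described above. Names are hypothetical.

WEIGHTS = {
    "portfolio_quality": 0.25,
    "client_results": 0.20,
    "industry_recognition": 0.15,
    "process_collaboration": 0.15,
    "team_expertise": 0.15,
    "innovation": 0.10,
}

def overall_score(category_scores: dict[str, float]) -> float:
    """Weighted average of the six category scores; weights sum to 1.0."""
    return round(sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS), 2)

# Example: a studio strong on portfolio and client results, weaker on innovation.
example = {
    "portfolio_quality": 8.0,
    "client_results": 8.0,
    "industry_recognition": 7.0,
    "process_collaboration": 8.0,
    "team_expertise": 7.0,
    "innovation": 6.0,
}
print(overall_score(example))  # 7.5
```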
Evaluation Process
Step 1: Studio Identification & Eligibility
For each city, we compile a comprehensive list of studios through:
- Design award winner databases (past 5 years)
- Design publication coverage and "best of" lists (critically reviewed)
- Conference speaker rosters
- Referrals from industry sources
- Studios suggested by readers (via contact form)
Eligibility requirements (a minimal screening check is sketched after this list):
- At least 2 years of operation under current name/ownership
- Minimum 5 verifiable projects with public portfolio
- Active operation (studios must be currently accepting projects)
- Primary physical presence in the city (not just a mailing address)
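For illustration, a screening check based on the requirements above might look like the sketch below; the field names are hypothetical, and the real screen is an editorial review rather than a script:

```python
# Illustrative eligibility screen based on the requirements above.
# Field names are hypothetical; actual screening is an editorial review.

from dataclasses import dataclass

@dataclass
class StudioCandidate:
    years_under_current_name: float
    verifiable_public_projects: int
    accepting_projects: bool
    physical_presence_in_city: bool

def is_eligible(s: StudioCandidate) -> bool:
    return (
        s.years_under_current_name >= 2
        and s.verifiable_public_projects >= 5
        and s.accepting_projects
        and s.physical_presence_in_city
    )

print(is_eligible(StudioCandidate(3.5, 12, True, True)))  # True
print(is_eligible(StudioCandidate(1.0, 8, True, True)))   # False (under 2 years)
```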
Step 2: Independent Evaluation
Three editors independently score each studio across all six criteria. Editors do not confer during initial scoring to maintain independence. For portfolio evaluation, studio names are hidden to reduce bias.
Step 3: Scoring Calibration
After independent scoring, editors meet to:
- Identify and discuss score outliers (>2 point differences)
- Review evidence and adjust scores where new information emerges
- Ensure consistent interpretation of scoring rubrics
- Document reasoning for close rankings
Final scores are the average of three editors' calibrated scores.
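A minimal sketch of the outlier check and averaging described in Steps 2-3, assuming three editor scores per criterion and the 2-point threshold noted above; function names are illustrative:

```python
# Illustrative sketch of Steps 2-3: flag criteria where the three editors'
# independent scores differ by more than 2 points, then average the
# calibrated scores. Names are hypothetical.

from statistics import mean

def needs_calibration(scores: list[float], threshold: float = 2.0) -> bool:
    """True if the spread across the three editors' scores exceeds the threshold."""
    return max(scores) - min(scores) > threshold

def final_score(calibrated_scores: list[float]) -> float:
    """Final criterion score is the mean of the three calibrated scores."""
    return round(mean(calibrated_scores), 2)

portfolio_scores = [8.5, 6.0, 9.0]           # independent first-pass scores
print(needs_calibration(portfolio_scores))    # True -> editors discuss and adjust
print(final_score([8.5, 8.0, 9.0]))           # 8.5 after calibration
```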
Step 4: Verification & Fact-Checking
Before publication:
- All factual claims (founding dates, locations, project credits) are verified against primary sources
- Project outcomes are validated against public records when possible
- Studios are given the opportunity to correct factual errors (but not to dispute scores)
- Pricing estimates are cross-referenced with industry sources and project value data
Step 5: Publication & Transparency
Published rankings include:
- Overall score and individual category scores for top-ranked studios
- Last updated date
- Number of studios evaluated
- Link to this methodology
- Disclosure of any conflicts of interest (even perceived ones)
Data Sources
| Source Type | Examples | Reliability Weight |
|---|---|---|
| Primary Sources | Studio portfolios, case studies, published interviews | High - directly verifiable |
| Award Databases | D&AD, ADC, Webby, Red Dot, Fast Company | High - objective records |
| Design Publications | Design Observer, It's Nice That, Eye on Design | High - editorial standards |
| Client Testimonials | Studio websites, case studies | Medium - self-selected but useful |
| Review Platforms | Clutch, GoodFirms | Low - verification challenges |
| Community Discussion | Designer forums, social media | Low - requires careful vetting |
What Does NOT Influence Rankings
We Do Not Accept:
- Payment for inclusion or favorable placement
- Advertising relationships in exchange for rankings
- Affiliate commissions from studios
- Free services or gifts from studios being ranked
- Requests to exclude competitors
- Pressure to change scores after publication
Update Cadence
Design studio rankings are living documents:
- Quarterly reviews: Every 3 months, we review rankings for each city, updating scores based on new projects, awards, or market changes.
- Immediate updates: Studio closures, major leadership changes, or significant quality shifts trigger out-of-cycle updates within 30 days of verification.
- Annual deep dives: Once per year, we re-evaluate methodology and expand the number of studios reviewed per city.
- Changelog tracking: All ranking changes are documented in our public changelog with dates and reasoning.
Adding New Studios
We continuously monitor for emerging studios. If you believe a studio should be evaluated:
- Submit a suggestion with studio name, location, and why it should be considered
- We'll review against eligibility criteria within 10 business days
- If eligible, the studio enters the next quarterly review cycle
- Studios are evaluated using the same process as all existing listings
Important: Suggesting a studio does not guarantee inclusion. Studios must meet eligibility requirements and score competitively within their market.
Corrections & Disputes
Factual Errors
If you identify incorrect information (wrong address, outdated pricing, project misattribution), submit a correction with supporting evidence. We investigate all corrections within 5 business days and publish updates with acknowledgment.
Score Disputes
Studios cannot dispute subjective rankings (e.g., "we think we should be #1 instead of #3"). However, if a studio believes their score was based on factual errors or evaluation mistakes, they can:
- Contact us via official form with specific concerns
- Provide evidence of evaluation errors (not just disagreement)
- We'll review with fresh editors and respond within 15 business days
If a re-evaluation finds errors, we update scores and publish a correction notice. If our original evaluation stands, we explain the reasoning.
Editorial Independence
Our rankings are editorial decisions made by our team. We maintain independence through:
- No advertising relationships with ranked studios
- No affiliate revenue from studio referrals
- No shared ownership with studios or holding companies
- Editors do not accept free services, gifts, or hospitality from studios
- Full disclosure if any conflict of interest emerges
We plan to introduce advertising in the future, but advertisers will never receive preferential ranking treatment. All ads will be clearly labeled.
Limitations & Acknowledgments
Our methodology has inherent limitations:
- Portfolio analysis is subjective - While we use multiple editors and rubrics, design evaluation involves taste and judgment.
- Public information only - We can't access confidential client data, so outcomes are based on publicly available information.
- Delayed recognition - Newly launched studios need time to build a portfolio and earn recognition before they can score competitively.
- Geographic coverage - We currently cover major U.S. cities; international expansion is planned but incomplete.
- Category breadth - Design studios span a huge range of specializations; rankings may not capture every niche perfectly.
Frequently Asked Questions
Do studios know they're being ranked?
We do not notify studios before initial ranking publication. After publication, studios may request factual corrections. This approach prevents pre-publication pressure or preferential access.
Can studios opt out of rankings?
No. Rankings are editorial content based on publicly available information, similar to journalism. Studios cannot opt out, but they can request correction of factual errors.
Why don't you share individual category scores for all studios?
We publish overall scores for all ranked studios and detailed category breakdowns for the top 10. Publishing full breakdowns for 50+ studios per city would create information overload without adding value for most readers.
How do you handle studios with multiple offices?
Multi-office studios are evaluated per location. If a studio has offices in New York and Los Angeles, each office is ranked separately within its city, based on that office's specific work, team, and capabilities.
Do you rank freelancers or solo practitioners?
Currently no. Our rankings focus on studios (defined as 2+ full-time designers). We may add freelancer rankings in the future with different methodology.
Methodology Evolution
We continuously refine our approach based on feedback, industry changes, and our own learning. Material methodology changes are published with 30 days' notice and explained in our changelog. Past methodologies remain archived for transparency.
Current version: 1.0 (Launched December 2025)