UX Myths That Are Costing You

In the world of user experience design, certain “rules” and assumptions have taken on lives of their own.

We've spent years testing, analyzing, and sometimes arguing about what really works in user experience design. And you know what? Some of the most widely repeated "rules" in UX are complete myths. Worse, they're actively hurting your website's performance.

After analyzing millions of user interactions and running countless tests, we've compiled the truth about seven persistent UX myths that just won't die. Let's bust them once and for all.

Myth #1: Homepage Carousels Are Great for Engagement

The Reality: They're Where Engagement Goes to Die

Here's a statistic that should make every marketer pause: when we tracked nearly 4 million homepage visits, only 1% of users clicked on carousel features. One percent.

And it gets worse. Of that tiny fraction who did engage, 84% clicked exclusively on the first slide. The remaining slides? They might as well not exist. Each subsequent slide received only about 4% of clicks—a devastating 20x drop-off from the first position.

We've run head-to-head tests comparing the exact same content as a carousel versus a static image. The results were eye-opening: the static version generated 20 times more engagement than the rotating carousel. Not 20% more. Twenty times more.

Why do carousels fail so spectacularly? Through eye-tracking studies, we discovered that users treat them like banner ads—they've learned to ignore anything that moves and rotates automatically. One user perfectly captured the problem while searching for washing machine deals on a home appliance site: "I didn't have time to read it. It keeps flashing too quickly." Meanwhile, a $100 discount promotion rotated past every 5 seconds, completely unnoticed.

What to do instead: Replace homepage carousels with a single, powerful hero image or a clear call-to-action. If you absolutely must show multiple messages, use static cards or sections that users can scroll through at their own pace.

One exception: Product detail page carousels showing different angles of the same product perform significantly better (around 23% interaction on mobile). The key difference? Users actively want to see more views of a product they're already interested in, versus being randomly bombarded with unrelated promotions.

Myth #2: Users Will Leave If They Can't Find Something in Three Clicks

The Reality: Users Don't Count Clicks—They Follow Clear Paths

The "three-click rule" is one of the most persistent myths in web design, and it needs to die. We tracked 44 users completing 620 different tasks—over 8,000 clicks total. Want to know what we found?

Zero correlation between the number of clicks and task completion success. Zero correlation between clicks and user satisfaction.

Users clicked up to 25 times without abandoning tasks, as long as each click made sense and moved them closer to their goal. In one dramatic case, we redesigned an e-commerce site to make products accessible in 4 clicks instead of 3, but with much clearer navigation. The result? A 600% increase in the rate at which users successfully found products.

Here's what users actually care about: confidence that they're on the right path. We call this "information scent"—the cues that tell users whether they're getting warmer or colder in their search. Strong, descriptive labels and clear wayfinding let users happily click a dozen times without frustration.

As one of our favorite UX principles states: three mindless, unambiguous clicks beat one click that requires thought.

What to do instead: Stop counting clicks in your analytics. Instead, focus on creating clear, descriptive navigation labels and strong visual hierarchies. Test whether users understand their options at each step. A user who confidently clicks 10 times is having a better experience than one who clicks twice but feels lost.

Myth #3: Five Users Are Enough for Usability Testing

The Reality: It Depends on What You're Testing

This myth is partially true, which makes it extra dangerous. Yes, testing with 5 users can uncover 85% of usability problems—but only under specific conditions.

Through years of research, we've learned that 5 users works brilliantly for identifying problems in qualitative testing. If you're trying to discover what's broken, 5 users will show you the major issues.

But here's what doesn't work: using 5 users to measure how severe problems are, to make quantitative decisions, or to validate if your fixes actually worked. For that, you need 20+ users to get statistically meaningful data.

And if you have multiple distinct user groups? Multiply accordingly. Testing 5 teachers and 5 students doesn't give you enough data about either group.
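The oft-quoted 85% figure comes from a simple probability model: if each user independently encounters a given problem with probability p (classically estimated at about 0.31), then n users reveal it with probability 1 − (1 − p)^n. A quick sketch of that model:

```python
# Probability that at least one of n test users hits a usability problem,
# assuming each user independently encounters it with probability p.
# p = 0.31 is the classic estimate from the Nielsen & Landauer model.
def discovery_rate(n: int, p: float = 0.31) -> float:
    return 1 - (1 - p) ** n

for n in (1, 5, 15, 20):
    print(f"{n:2d} users -> {discovery_rate(n):.0%} of problems seen at least once")
```

Note what the model does and doesn't promise: 5 users surface most problems at least once, but seeing a problem once tells you nothing reliable about its frequency or severity, which is exactly why quantitative claims need the larger samples.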

What to do instead: Think of 5-user testing as detective work—perfect for finding problems early and iterating quickly. We typically run multiple rounds of small tests rather than one big study. But when you need to measure severity, prove ROI, or make data-driven decisions, invest in larger sample sizes.

Our sweet spot: Run several 5-user studies during design iteration, then validate with 20+ users before major releases.

Myth #4: Multi-Step Forms Reduce Completion Rates

The Reality: Perceived Effort Matters More Than Step Count

Everyone fears multi-step forms because conventional wisdom says more steps equals more abandonment. But our testing reveals a more nuanced truth.

Breaking a long form into multiple steps can actually improve completion rates by 10-35%—when done right. The key is managing perceived effort and cognitive load, not minimizing steps.

Think about it: would you rather face a single page with 30 intimidating form fields, or three screens with 10 fields each, showing a progress indicator? The second option feels more achievable, even though the work is identical.

We've seen dramatic improvements just by chunking related information together logically and showing users their progress. One mortgage application form we redesigned went from a 20% completion rate to 46% simply by reorganizing the same fields into clearer sections.
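Before crediting a redesign for a jump like 20% to 46%, it's worth confirming the difference isn't noise. A minimal two-proportion z-test sketch; the 500-sessions-per-variant sample sizes are illustrative assumptions, not figures from our data:

```python
# Sketch: two-proportion z-test for a change in form completion rate.
from math import sqrt, erf

def two_prop_z(success_a: int, n_a: int, success_b: int, n_b: int):
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of no difference.
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 20% vs 46% completion across 500 sessions each.
z, p = two_prop_z(100, 500, 230, 500)
print(f"z = {z:.2f}, p = {p:.4g}")
```

With samples this size, a 26-point lift is overwhelmingly significant; with 30 sessions per variant, it often wouldn't be, which is the practical reason to size your test before declaring victory.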

The real villains: Unnecessary fields. We audit every form and ruthlessly eliminate fields that aren't absolutely essential. Cutting field count from 20 to 12 often has more impact than any step optimization.

What to do instead: Test your specific form with your specific audience. If breaking it into steps reduces cognitive load and makes progress feel achievable, do it. But first, ask yourself: do we really need all these fields?

Myth #5: Hamburger Menus Are Bad for Usability

The Reality: Context Is Everything

The great hamburger menu debate has raged for years, and our testing shows both sides are partially right—which means both are also partially wrong.

On desktop sites, visible navigation typically outperforms hidden hamburger menus by significant margins. One of our tests showed a 20-30% increase in engagement when we replaced a hamburger menu with visible links.

But on mobile devices with 5+ navigation items? Hamburger menus often perform just as well as visible menus, and sometimes better because they don't clutter the limited screen space.

What to do instead: Follow these guidelines based on our extensive testing:

  • Desktop: Keep primary navigation visible. Users expect to see main options.
  • Mobile with few items (3-4): Consider visible bottom navigation or tabs.
  • Mobile with many items (5+): Hamburger menus work fine. Users understand the pattern.
  • Always: Make sure your most important actions are immediately visible, regardless of menu style.

Myth #6: Users Don't Scroll Below the Fold

The Reality: Most Users Spend 2.5 Hours Daily "Dreamscrolling"

This myth was born in the early 2000s when users were still learning web behaviors. Today? Scrolling is second nature.
 
Recent data shows the average person spends 2.5 hours per day scrolling through content. Platforms like blogs, news sites, and every social media site thrive on continuous scrolling.
 
The real question isn't whether users will scroll—it's whether your content is worth scrolling for.
 
What to do instead: Put your most important message and call-to-action at the top, but don't be afraid to tell your full story. Use engaging visuals, clear headings, and compelling content to draw users deeper. We've seen blog posts of 3,000+ words outperform short posts because the content was genuinely valuable.
 
Test what works for your audience. Use heatmaps to see how far users scroll and which content captures attention.
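If your analytics tool exports maximum scroll depth per session, a few lines will tell you how far users actually get. A sketch with hypothetical depth values (percent of page height reached):

```python
# Sketch: summarizing max scroll depth per session.
# The depths list is invented sample data, not real analytics.
from statistics import median, quantiles

depths = [35, 100, 80, 55, 90, 100, 20, 75, 100, 60, 45, 95]

p25, p50, p75 = quantiles(depths, n=4)
print(f"median depth: {median(depths):.0f}% of page height")
print(f"25th/75th percentile: {p25:.0f}% / {p75:.0f}%")
print(f"reached bottom: {sum(d == 100 for d in depths) / len(depths):.0%}")
```

Pair the numbers with a scroll heatmap: the percentiles tell you whether users scroll, the heatmap tells you where their attention drops off.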

Myth #7: Accessibility Is Optional or Can Be Added Later

The Reality: It's Legally Required and Good for Everyone

This isn't just about doing the right thing anymore (though that should be enough). Accessibility is now legally mandated with specific deadlines and substantial penalties for non-compliance.

Recent regulations require websites to meet WCAG 2.1 Level AA standards, with compliance deadlines hitting in 2025 and 2026 depending on your location. We're also seeing an average of 8,800 accessibility-related lawsuits filed annually, with a 7% year-over-year increase.
 
But here's the thing: when we analyzed the top million websites, 96.3% had accessibility failures, averaging 51 errors per page. This isn't because people don't care—it's because they treat accessibility as an afterthought.
 
Accessible design benefits everyone. Clear color contrast helps users in bright sunlight. Keyboard navigation helps power users who prefer shortcuts. Clear labels help everyone understand your interface faster.
 
What to do instead: Build accessibility in from day one. Include users with disabilities in your testing. Run both automated tests (which catch about 30% of issues) and manual evaluation with actual assistive technologies. Budget for ongoing audits, not one-time fixes.
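Some accessibility rules are fully mechanical and easy to automate. One example is WCAG 2.1's color-contrast requirement (4.5:1 for normal text at Level AA); a small checker built on the spec's relative-luminance formula:

```python
# Sketch: WCAG 2.1 contrast-ratio check for a foreground/background pair.
def relative_luminance(hex_color: str) -> float:
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    def lin(c: float) -> float:
        # sRGB linearization per the WCAG relative-luminance definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b)

def contrast_ratio(fg: str, bg: str) -> float:
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

ratio = contrast_ratio("#767676", "#ffffff")
print(f"{ratio:.2f}:1 -> {'passes' if ratio >= 4.5 else 'fails'} WCAG AA normal text")
```

Gray #767676 on white sits right at the 4.5:1 threshold, while the slightly lighter #777777 already fails, exactly the kind of regression automated checks catch and humans miss. Remember, though, that checks like this cover only the mechanical subset of issues, roughly the 30% that automated tools find, so pair them with manual assistive-technology testing.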

The Bottom Line

After years of testing and millions of user interactions, we've learned that many popular UX "rules" are oversimplifications at best and harmful at worst. What actually works?
 
Test everything in your specific context. A practice that works brilliantly for e-commerce might fail for B2B software. What succeeds with millennials might frustrate older users.
 
Focus on user intent, not arbitrary rules. Users don't care about your click count or step count—they care about accomplishing their goals efficiently and confidently.
 
Iterate constantly. The best-performing websites aren't the ones that followed all the rules. They're the ones that tested, learned, and refined based on real user behavior.
 
The UX field is evolving rapidly, with AI tools, mobile dominance, and accessibility requirements reshaping everything we thought we knew. The designers who succeed aren't the ones who memorize rules; they're the ones who question assumptions and let data guide their decisions.
 
So the next time someone tells you that users won't scroll, or you must achieve everything in three clicks, or carousels increase engagement, ask them: "Have you tested that?"

Because we have. And the results might surprise you.
