
How Coinbase Achieved 100% AI Coding Adoption with Cursor

Coinbase reached 100% Cursor adoption, cutting project timelines from months to days while tracking quality metrics.

TL;DR

  • Coinbase achieved 100% Cursor adoption across all engineers by February 2025
  • Project timelines dropped from months to days for refactoring and migrations
  • Used DORA metrics to measure both gains and quality degradation
  • Best for: Enterprise teams considering organization-wide AI coding tool rollout
  • Key lesson: Universal adoption requires measuring failures, not just celebrating speed

Coinbase proved that 100% AI coding tool adoption is achievable at enterprise scale—but only by honestly measuring both the productivity gains and the quality tradeoffs.

By February 2025, every engineer at Coinbase used Cursor.

Not “had access to.” Not “sometimes opened.” Used. Every day. 100% adoption across the engineering organization.

Most companies would celebrate that number and move on. Coinbase did something unusual: they measured what went wrong.

“Universal adoption is easy to announce. Understanding what it actually changed—the good and the bad—requires discipline.”

The transformation

Coinbase CEO Brian Armstrong didn’t mince words about the impact:

“Single engineers are now refactoring, upgrading, or building new codebases in days instead of months.”

Days instead of months. That’s not incremental improvement. That’s a different velocity category.

Projects that required teams now required individuals. Codebases that seemed frozen due to migration complexity became changeable again. Technical debt that no one wanted to touch became addressable.

“We had engineers taking on projects they would have rejected a year ago. Not because they got braver. Because AI made the work tractable.”

The DORA framework

Coinbase didn’t guess whether AI helped. They measured using DORA metrics—the industry-standard framework for engineering performance.

  • Deployment Frequency: how often code ships to production.
  • Lead Time for Changes: how long from commit to deploy.
  • Change Failure Rate: how often deployments cause problems.
  • Mean Time to Recovery: how fast problems get fixed.
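All four metrics can be derived from ordinary deployment and incident records. A minimal Python sketch, assuming a hypothetical record schema (the field names here are illustrative, not Coinbase's actual data model):

```python
from datetime import datetime
from statistics import mean

# Hypothetical deployment records over a one-week window.
deploys = [
    {"committed": datetime(2025, 2, 3, 9), "deployed": datetime(2025, 2, 3, 15), "failed": False},
    {"committed": datetime(2025, 2, 4, 10), "deployed": datetime(2025, 2, 5, 11), "failed": True},
    {"committed": datetime(2025, 2, 6, 8), "deployed": datetime(2025, 2, 6, 12), "failed": False},
]
# Hypothetical incident records for the same window.
incidents = [
    {"opened": datetime(2025, 2, 5, 11), "resolved": datetime(2025, 2, 5, 13)},
]

period_days = 7

# Deployment Frequency: deploys per day over the window.
deploy_frequency = len(deploys) / period_days

# Lead Time for Changes: mean commit-to-deploy time, in hours.
lead_time_hours = mean(
    (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deploys
)

# Change Failure Rate: fraction of deploys that caused problems.
change_failure_rate = sum(d["failed"] for d in deploys) / len(deploys)

# Mean Time to Recovery: mean incident open-to-resolve time, in hours.
mttr_hours = mean(
    (i["resolved"] - i["opened"]).total_seconds() / 3600 for i in incidents
)

print(deploy_frequency, lead_time_hours, change_failure_rate, mttr_hours)
```

The point of a baseline is that each of these four numbers existed before the rollout, so any later shift can be attributed and investigated rather than argued about.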

Before AI, they had baselines. After AI, they had comparison points.

“Everyone talks about AI making developers faster. We built the measurement infrastructure to prove it—and to catch it when the opposite happened.”

The uncomfortable truth

Here’s what makes Coinbase’s transparency valuable: they documented the failures.

Increased AI usage correlated with increased bugs.

Not every bug. Not catastrophic failures. But measurable degradation in certain quality metrics as AI assistance increased.

“The pattern made sense once we saw it. AI-generated code shipped faster. That meant more code overall. More code means more opportunities for defects.”

Coinbase didn’t hide this finding. They built monitoring around it.

Quality gates that catch AI-introduced regressions. Review processes calibrated for AI-assisted commits. Testing automation that keeps pace with increased output.
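One way such a gate can be wired into a pipeline is to compare a trailing change failure rate against the pre-AI baseline and block when it drifts. A hedged sketch, where the baseline, tolerance, and record shape are all assumptions:

```python
BASELINE_FAILURE_RATE = 0.10  # pre-AI baseline change failure rate (assumed)
TOLERANCE = 0.05              # allowed drift before the gate trips (assumed)

def quality_gate(recent_deploys: list[dict]) -> bool:
    """Return True if the trailing change failure rate stays within
    tolerance of the baseline; False means block and investigate."""
    if not recent_deploys:
        return True
    failure_rate = sum(d["failed"] for d in recent_deploys) / len(recent_deploys)
    return failure_rate <= BASELINE_FAILURE_RATE + TOLERANCE

# Example: 2 failures in the last 10 deploys → 0.20 > 0.15, gate trips.
window = [{"failed": i < 2} for i in range(10)]
print(quality_gate(window))  # → False
```

The design choice worth noting: the gate compares against a measured baseline rather than an absolute number, so it flags regressions introduced by the new workflow instead of punishing teams whose services were always noisier.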

“Velocity without quality is just faster failure. We optimized for both.”

What 100% adoption actually means

Every engineer using Cursor doesn’t mean every task uses AI.

The reality is more nuanced. Engineers developed judgment about when AI helps and when it hinders.

AI excels at: Boilerplate generation, test scaffolding, documentation, unfamiliar codebase navigation, syntax in new languages.

AI struggles with: Deep architectural decisions, security-critical code, complex business logic, nuanced performance optimization.

“We didn’t mandate AI everywhere. We gave everyone AI access and let them figure out where it mattered. That’s how you get genuine 100% adoption instead of resentful compliance.”

The developer experience

Coinbase’s engineering culture prized developer experience before AI. That made AI adoption smoother.

When engineers have strong opinions about their tools—when they expect their tooling to be excellent—they engage with new capabilities seriously.

“Our developers aren’t passive tool users. They’re tool critics. When we rolled out Cursor, we got immediate feedback about what worked and what didn’t. That feedback shaped our deployment.”

The result: an AI implementation that reflected actual engineering needs rather than management assumptions.

Lessons from 100%

Coinbase’s journey to universal AI coding adoption offers templates for other organizations:

Measure the negatives. Bug correlation with AI usage isn’t a reason to abandon AI. It’s a signal to build countermeasures. You can’t fix what you don’t measure.

Let adoption happen naturally. Mandating tool usage creates compliance. Providing excellent tools creates adoption.

Preserve human judgment. AI handles the routine. Humans handle the critical. The boundary isn’t fixed—it shifts as trust develops.

Speed is a feature, not the goal. Days instead of months matters only if what ships in days works as well as what used to ship in months.

“We didn’t just move faster. We moved faster while maintaining our quality bar. That’s the actual achievement.”

The future

Coinbase isn’t finished with AI integration. 100% Cursor adoption is a foundation, not a destination.

Code review assistance. Automated security scanning. Architecture suggestion systems. Each builds on the foundation of engineering teams comfortable with AI collaboration.

“We proved the baseline: universal adoption is possible and beneficial. Now we’re exploring what’s possible when every engineer has an AI collaborator by default.”

The 100% company is just getting started.

FAQ

How long did it take Coinbase to reach 100% Cursor adoption?

Coinbase reached 100% adoption by February 2025. The length of the rollout from initial access to universal daily use isn't stated, but the key was letting adoption happen naturally rather than mandating it.

Did AI coding tools actually improve productivity at Coinbase?

Yes, dramatically. Single engineers now complete refactoring and migration projects in days instead of months. Tasks that previously required teams became tractable for individuals.

What are DORA metrics and why do they matter for AI adoption?

DORA metrics measure engineering performance: deployment frequency, lead time, change failure rate, and recovery time. Coinbase used these to objectively measure AI's impact rather than relying on anecdotes.

Did AI coding cause more bugs at Coinbase?

Yes, increased AI usage correlated with increased bugs. Coinbase addressed this by building quality gates, calibrating review processes for AI-assisted commits, and scaling testing automation.

What tasks should you avoid using AI coding tools for?

Coinbase found AI struggles with deep architectural decisions, security-critical code, complex business logic, and nuanced performance optimization. AI excels at boilerplate, test scaffolding, and documentation.