Your Company’s “North Star Metric” Is Just a Number You Picked Because It Goes Up

Strategy

The North Star Metric is one of the most important concepts in modern growth strategy. It aligns teams, focuses execution, and gives the entire organization a single measurable goal to rally behind. It is also, in nearly every case, whichever number happened to be going up the quarter someone asked the growth team to find a North Star Metric.

We analyzed the KPI strategies of 214 mid-stage startups over a three-year period. The pattern was so consistent it barely qualifies as a finding: companies adopt a North Star Metric roughly two to six weeks after that metric begins trending upward, and they abandon it roughly two to six weeks after it stops. The average North Star Metric has a lifespan of 4.7 months. The average employee can name their company's current one about 31% of the time.

The Rotation Cycle

Every company follows the same trajectory, and it always starts with revenue. Revenue is the North Star when revenue is growing. The CEO says things like "we're a revenue-first organization" and "everything ladders up to ARR." Dashboards are built. OKRs cascade downward. For a while, it works, because the number is going up and any framework looks brilliant when applied to a line that's already ascending.

Then revenue flattens. Maybe the market shifts. Maybe a big contract churns. Maybe the sales cycle just gets longer as you move upmarket. Whatever the reason, the quarterly board deck needs a new story, and within weeks, the North Star quietly migrates to Daily Active Users. The language changes overnight. The CEO now says "we're building a platform, and platforms are measured by engagement, not near-term monetization." Nobody mentions that revenue is flat. The new dashboard is already in Looker.

DAUs have a good run, usually one or two quarters, until someone notices that users are logging in but not really doing anything. At that point, the North Star becomes "engagement minutes" or "time in app" or, at one company we studied, "meaningful session duration," which was defined as any session longer than 45 seconds. The product team is told to optimize for depth, not breadth. Push notifications increase by 300%.

When time-in-app plateaus—often because users begin to resent the notifications—the final evolution occurs. The North Star becomes something no one can actually measure with confidence: "quality interactions," "moments of value," or the perennial favorite, "customer love." These metrics require custom survey instruments and six-week data lags, which means they can't be disproven in real time. This is not a bug. This is the point.

The Board Presentation Layer

The board deck is where the North Star Metric becomes performance art. A well-constructed deck never acknowledges that the metric changed. It simply presents the current metric as though it has always been the metric, with a carefully chosen time window that shows an upward slope. If the current metric has only been tracked for nine weeks, the x-axis starts nine weeks ago. If it requires a trailing 90-day average to look good, it uses a trailing 90-day average. If it only looks good when segmented by enterprise accounts in North America who onboarded after the redesign, then that is the cohort that appears on slide 7.
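The slide-7 procedure is mechanical enough to automate. Here is a purely illustrative sketch (every name, threshold, and series below is invented, not drawn from any company we studied) of the window-shopping step: scan a few candidate lookback windows and keep the longest one over which the metric still slopes up.

```python
def best_window(series, candidates=(7, 30, 60, 90, 180)):
    """Return the longest trailing window (in data points) over which
    the metric still slopes upward -- i.e., the one that goes on slide 7."""
    def slope(points):
        # Ordinary least-squares slope against index position.
        n = len(points)
        mean_x = (n - 1) / 2
        mean_y = sum(points) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(points))
        den = sum((x - mean_x) ** 2 for x in range(n))
        return num / den if den else 0.0

    upward = [w for w in candidates
              if w <= len(series) and slope(series[-w:]) > 0]
    return max(upward, default=None)  # None means: time for a new North Star

# A metric that grew for 60 periods, then declined for 30.
history = [100 + i for i in range(60)] + [160 - 0.5 * i for i in range(30)]
print(best_window(history))  # prints 90: the short windows slope down, the long one still "works"
```

In practice this logic lives in a human rather than a function, which makes it considerably harder to code-review.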

Board members, to their credit, sometimes notice. "Didn't we used to track monthly active users?" a director will ask. The response is always some version of: "We still track MAU, but we found that [new metric] is a better leading indicator of long-term value creation." This sentence has never once been verified. It doesn't need to be. It just needs to be said with enough conviction to get to slide 8.

The Analysts Who Build the Dashboards

Spare a thought for the data team. Every North Star migration generates approximately 40 hours of analytics work: new dashboards, new definitions, new segments, new Slack alerts, new weekly reports. The senior analyst at one Series C company told us she had built North Star dashboards for revenue, DAU, weekly active teams, "activation score," and something called "ecosystem velocity," all within eighteen months. Each time, she was told this was the real one. Each time, she archived the previous one to a folder she privately titled "Former Stars."

The definitions are the hardest part. "Engagement" means something different every time it becomes the North Star. In Q1, an engaged user might be someone who logs in three times a week. By Q3, when that threshold yields a flat line, an engaged user is someone who "performs a core action," which is itself redefined quarterly. The analyst builds the query. The analyst does not ask why. The analyst has a mortgage.
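The redefinition loop can be sketched too. A hypothetical version (the thresholds and field names are invented for illustration): walk the increasingly generous definitions in order and stop at the first one that clears last quarter's number.

```python
# Hypothetical definitions, each more generous than the last. None of
# these thresholds come from a real company; they are for illustration.
DEFINITIONS = [
    ("Q1: logs in 3x/week",         lambda u: u["logins_per_week"] >= 3),
    ("Q2: logs in 1x/week",         lambda u: u["logins_per_week"] >= 1),
    ("Q3: performs a core action",  lambda u: u["core_actions"] >= 1),
    ("Q4: session over 45 seconds", lambda u: u["longest_session_sec"] > 45),
]

def pick_definition(users, last_quarter):
    """Return the first definition of "engaged" generous enough to beat
    last quarter's count. The analyst builds the query."""
    for name, is_engaged in DEFINITIONS:
        if sum(1 for u in users if is_engaged(u)) >= last_quarter:
            return name
    return "customer love (survey pending, six-week data lag)"

users = [
    {"logins_per_week": 4, "core_actions": 2, "longest_session_sec": 300},
    {"logins_per_week": 1, "core_actions": 0, "longest_session_sec": 10},
    {"logins_per_week": 0, "core_actions": 0, "longest_session_sec": 50},
]
print(pick_definition(users, last_quarter=2))  # prints "Q2: logs in 1x/week"
```

Note the fallback case, which is where every company in our sample eventually landed.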

Why the Metric Is Never Wrong

The North Star framework isn't flawed because it picks the wrong metric. It's flawed because the selection process is, in practice, indistinguishable from confirmation bias. The metric is chosen because it validates the current narrative. When the narrative changes—because the market changes, or the board changes, or the VP of Growth changes—the metric changes with it. The framework provides the vocabulary of rigor without the constraint of it.

This is why no one ever proposes a North Star Metric that is currently going down. It has never happened. Not once, in 214 companies. The metric that would tell you the most about your business—the one that's declining, the one that's uncomfortable, the one that would require actual strategic change to move—is never the North Star. It's on a dashboard somewhere, in a tab nobody opens, built by an analyst who already knows.

The Star That Holds Still

Actual north stars don't move. That's the whole point of the metaphor. You're supposed to pick the fixed thing in a sky full of things that rotate. But in practice, the corporate North Star rotates with everything else, and the only fixed point is the quarterly need to present a number that goes up and to the right.

Somewhere in your company, there is a metric that has been flat for two years. It isn't on any dashboard. It doesn't appear in the board deck. No one has set an OKR against it since the seed round. It might be the most honest thing your data team has ever measured. But it doesn't move, so nobody looks at it. That's the problem with real north stars. They're only useful if you're willing to hold still long enough to navigate by them.

More From the PoopOS Blog

We write about the things everyone in your organization already knows but nobody is allowed to say in a meeting.