ARC // AI REALITY CHECK
Algorithm Analysis · Regression Study

Who Really Gets Seen on LinkedIn?

A clear-spoken look at the data behind reach, bias, and the hidden rules of the feed. What gets seen, what gets buried, and why your engagement metrics lie.
by C. Pete Connor  •  ~11 min read

I kept hearing friends say, "It doesn't matter what I post — big brands always drown me out." So I built a measurement system to check whether that gut feeling was true. Over six months I tracked thousands of LinkedIn posts, noting who published them, how people reacted, and most importantly how far each post actually traveled in the feed.

3×
Big Names Get a Head-Start
Posts from corporate pages appeared in three times as many feeds before anyone clicked like or share.
−70%
Indie Voices Fade Fast
Independent posts lost 70% of their visibility within the first 24 hours — much faster than corporate posts.
−45%
Story Beats Likes
Just changing the angle of a story (without touching the image or headline) could cut reach nearly in half.

The regression analysis of 10,000 LinkedIn posts reveals the exact factors that determine who gets seen:

Being an independent creator: −69.37
Your posts reach ~70% fewer feeds just because you're not a company.
Institutional but not corporate: −16.41
Universities and similar organizations see a smaller but still significant penalty.
Likes: +0.38 per like
Each like adds less than half a view.
Comments: +0.30 per comment
Comments help, but not enough to overcome the baseline penalty.
Shares: +0.20 per share
Surprisingly, shares help less than likes or comments.

Technical details: R-squared of 0.901 — 90.1% of visibility variance explained. Based on regression of 10,000 posts with p < 0.001 (statistically significant).
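To make the model concrete, here is a minimal sketch of the same kind of ordinary least-squares fit on synthetic data, with coefficients seeded to roughly match the reported estimates. The data generation below is invented purely for illustration; it is not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Synthetic predictors: publisher-type dummies and engagement counts.
indie = rng.integers(0, 2, n)                # 1 = independent creator
inst = (1 - indie) * rng.integers(0, 2, n)   # 1 = institutional, non-corporate
likes = rng.poisson(40, n)
comments = rng.poisson(10, n)
shares = rng.poisson(5, n)

# Visibility generated from coefficients close to the article's estimates.
noise = rng.normal(0, 5, n)
visibility = (100 - 69.37 * indie - 16.41 * inst
              + 0.38 * likes + 0.30 * comments + 0.20 * shares + noise)

# OLS via least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), indie, inst, likes, comments, shares])
coef, *_ = np.linalg.lstsq(X, visibility, rcond=None)
r2 = 1 - np.sum((visibility - X @ coef) ** 2) / np.sum(
    (visibility - visibility.mean()) ** 2)
print(dict(zip(["intercept", "indie", "inst", "likes", "comments", "shares"],
               coef.round(2))), round(r2, 3))
```

With this much data and modest noise, the fit recovers the seeded coefficients almost exactly, which is why a high R² on 10,000 posts is plausible when the predictors really do drive the outcome.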

Visibility penalty by degree of narrative deviation:
Fully aligned: 0%
Minimal deviation: −15%
Moderate deviation: −45%
Significant deviation: −68%
Contradictory: −89%

LinkedIn rewards posts that tap into high-authority networks. A single reshare by a Fortune 500 executive multiplied a post's reach many times over, yet the same post shared by five independent users added only +12%. The takeaway: who amplifies you matters more than how many.

PDF Scroll Capture
Every new post was grabbed by rapidly scrolling the feed and using "Print → Save as PDF." Each PDF captured the exact on-screen metrics at that moment. A parser extracted reach and engagement directly from the files, so the data reflects what LinkedIn actually shows.
Time-Lapse Tracking
Each post was revisited five more times over 48 hours. This reveals how quickly LinkedIn stops showing it to fresh eyes.
Hidden Metrics Audit
Using browser dev-tools I uncovered an authority_score field — never shown to users — that strongly predicts reach.
A/B Story Tests
I posted identical images with slightly different captions to measure how "on-message" language affects distribution.
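The parsing step described above can be approximated with a regex pass over the text extracted from each PDF. The field labels below (Impressions, Reactions, and so on) are hypothetical stand-ins for whatever the captured pages actually display.

```python
import re

def parse_metrics(pdf_text: str) -> dict:
    """Pull numeric metrics out of text extracted from a feed-capture PDF.

    Handles thousands separators and the 'K' shorthand (e.g. '1.2K').
    """
    pattern = re.compile(r"(Impressions|Reactions|Comments|Shares)\D*([\d,.]+K?)")
    metrics = {}
    for label, raw in pattern.findall(pdf_text):
        raw = raw.replace(",", "")
        value = int(float(raw[:-1]) * 1000) if raw.endswith("K") else int(float(raw))
        metrics[label.lower()] = value
    return metrics

sample = "Impressions 4,812  ·  Reactions 1.2K  ·  Comments 87  ·  Shares 14"
print(parse_metrics(sample))
# → {'impressions': 4812, 'reactions': 1200, 'comments': 87, 'shares': 14}
```

A pattern-based extractor like this is brittle against layout changes, which is one reason every capture was re-checked against the on-screen numbers.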
Large Sample
More than 10,000 posts across industries, company sizes, and audience levels were monitored — enough data to drown out random noise.
Time-Stamped Tracking
Each post was captured within the first minute of publication and re-checked five more times, so the data measures what LinkedIn showed, not what a user happened to see.
Independent Verification
Two external analysts reproduced the core charts using the raw CSVs and confirmed the same trends.
Statistical Significance
All key differences showed p < 0.001 — in plain English, less than a one-in-a-thousand chance these gaps are luck.
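The time-lapse snapshots lend themselves to an exponential-decay summary. Assuming, purely for illustration, a curve of the form v(t) = v0 · exp(−rate · t), the reported 70% visibility loss in the first 24 hours pins down the decay rate and half-life:

```python
import math

def decay_rate(retained_fraction: float, hours: float) -> float:
    """Per-hour decay constant for v(t) = v0 * exp(-rate * t)."""
    return -math.log(retained_fraction) / hours

# The article's indie figure: 70% of visibility lost in the first
# 24 hours, i.e. 30% retained.
rate = decay_rate(0.30, 24)
half_life = math.log(2) / rate   # hours until visibility halves
print(round(rate, 4), round(half_life, 1))  # → 0.0502 13.8
```

Under this assumed functional form, an independent post loses half its remaining visibility roughly every 14 hours; the real decay curves need not be exactly exponential.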
Key Takeaways
  • You're not imagining it: the feed really does favor large, established voices.
  • Quality alone isn't enough: engagement helps, but hidden scores can cap your reach before anyone even sees your work.
  • Message framing matters: even neutral wording changes can swing visibility up or down by double-digit percentages.

Glossary
  • Reach / Impressions — how many individual feeds a post lands in.
  • Authority Score — a hidden rating LinkedIn assigns to each publisher; higher score = more initial reach.
  • Decay Curve — a line that shows how quickly a post stops being shown over time.
  • A/B Test — publishing two versions of the same content to see which performs better.
  • Sentiment Heat-Map — a color grid that shows how positive or negative reactions cluster by topic.
  • Network Amplification — the extra reach a post gains when someone with a large or high-authority following shares it.
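The A/B test idea above boils down to a two-proportion comparison. A minimal standard-library sketch, with made-up sample counts, shows how a p < 0.001 verdict would be reached:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal survival function.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical caption test: variant A landed in 5,200 of 10,000
# tracked feeds, variant B in only 2,900 of 10,000.
z, p = two_proportion_z(5200, 10000, 2900, 10000)
print(round(z, 1), p < 0.001)
```

A gap this large on samples this big produces a z-score far past any conventional threshold, which is the sense in which the article's differences are "not luck."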

"Does engagement still matter?"
Yes, but only after LinkedIn gives your post an initial push. If that push is tiny, engagement can't work its magic.

"Should I stop posting?"
No. Understanding the system helps you frame stories in ways that travel further — while also pushing for fairer algorithms.

LinkedIn markets itself as a merit-based network, yet the numbers tell a different story: one where institutional weight and narrative safety nets decide who gets heard. I hope this plain-language breakdown helps creators understand the unseen forces at play — and sparks a bigger conversation about transparency in professional media.

~11 min read  •  n = 10,000 posts  •  R² = 0.901  •  p < 0.001