
Stan Consulting · Problem · Construction Marketing

My competitor has 24 reviews. I have 87. He outranks me anyway.

A competitor with 24 reviews outranks your 87. Here is why, and what to do about it. The 7-day diagnostic finds the gap for $999.

Get the Diagnostic · $999

The complaint

The version of this you would write after losing a job you should have won.

You searched your own trade in your own city. The first listing in the map pack is a guy you have never heard of. 24 reviews. Two of them are three-star. The reviews include phrases like "ok work" and "tried to charge extra after." You scrolled. You found yourself at rank five. 87 reviews. 4.9 average. Half a decade of consistent five-stars from real customers in your zip.

You called your nephew who does some SEO. He said reviews matter most and you have more. You said the guy at rank one has fewer. He said "weird." You said it gets weirder: you have a better trade match, a longer history, a real office, and his profile shows a residential address. Your nephew said maybe ask for more reviews. You closed the call.

You looked at the guy at rank one again. He has photos uploaded last week. You uploaded photos last quarter. He posts on the GBP every Friday. You have posted twice this year. He responds to every review inside 48 hours. You respond to about half of yours, eventually.

You realized the algorithm is not rewarding the volume of your reviews. It is rewarding his consistency. You did the math. He is closing maybe 4 to 6 jobs a week off the map pack. You are closing maybe one. He is doing it with a worse profile, a worse track record, and a worse rating, because his profile is alive and yours has been sitting still for two years.

You opened the GBP dashboard. You stared at the post-creation button. You did not know what to write. You closed the tab. The phone is quiet.

What you already tried

Things you already did. None of them moved the rank.

  1. Asked your last 10 customers for reviews. Eight left them. Your average held at 4.9. Your rank did not change. Volume without other signals does not move a rank that is gated by engagement.
  2. Reported the competitor's listing for the residential address. Google reviewed and kept the listing. Address verification rules vary by trade and Google does not always remove what looks suspicious from the outside.
  3. Asked your marketing guy to "do more SEO." He raised his retainer and added blog content. The blog content does not affect the local pack much. The retainer increase had no effect on rank.
  4. Got more citations on directory sites. Citations help a little. Citations alone do not close the engagement gap with the top-3.
  5. Switched the primary category to be more specific. The new category had less search volume than the old one. Rank moved in the wrong direction. You moved it back.

The diagnostic questions

Seven questions to answer alone. The answers point at the gap.

This is where the page changes register. Answer these on paper. Most local-pack gaps have never been read this way.

  1. How often do the top-3 competitors in your map pack post on their GBP, and how often do you?
  2. What is the owner-response rate on the top-3's last 50 reviews, and what is yours?
  3. How recent is the most recent photo upload on the top-3, and how recent is yours?
  4. What is the average age of the top-3's last 20 reviews, and what is yours?
  5. Does the top-3's primary category match the buyer-search query you are trying to win? Does yours?
  6. How many Q&A questions has the top-3 answered on their GBP, and how many have you answered on yours?
  7. If the engagement signals were equalized to the top-3 over 90 days, what would that do to your rank and your closed-job pipeline?

If five of these come back blank, the engagement gap has never been measured. The audit measures it in seven days against the actual top-3 in your local pack.
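
The seven questions above can be forced onto paper with a spreadsheet, or with a few lines of code. A minimal sketch of the same side-by-side read, where every number is a made-up placeholder you would replace with your own counts and the top-3 averages you pulled by hand:

```python
# Side-by-side engagement read: your profile vs. the top-3 average.
# Every value below is a hypothetical placeholder, not real profile data.

def gap_report(signals):
    """Return the signals where the top-3 average beats yours."""
    return {name: round(top3 - you, 2)
            for name, (you, top3) in signals.items()
            if top3 > you}

signals = {
    # signal: (yours, top-3 average) -- higher is better on every row
    "posts_per_month":       (0.3, 4.0),
    "owner_response_rate":   (0.5, 1.0),   # share of last 50 reviews answered
    "photos_last_90_days":   (1,   6),
    "reviews_last_90_days":  (4,   11),
    "qa_answers":            (0,   9),
}

for name, gap in gap_report(signals).items():
    print(f"{name}: behind the top-3 by {gap}")
```

With the placeholder numbers, every row comes back as a gap, which is the usual shape of the problem: not one missing signal, all of them.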

What is actually happening

What the audit finds in cases like this.

The voice shifts from here. This is the structural read. Five things show up in almost every case of "outranked by worse reviews."

  1. The engagement signal beats the review count. The top-3 are posting, responding, uploading, and answering. The local-pack algorithm reads engagement as a freshness signal and weights it more heavily than raw count. See GBP Is the Local Pack Battle, Not the Website.
  2. The review velocity is uneven. 87 reviews collected over 5 years averages out to one new review every three weeks. The top-3 are getting one new review every 7 to 10 days. Recent velocity reads as a stronger signal than total volume.
  3. The category match is off by one degree. The buyer is searching "roofer near me" and the top-3 is set to "Roofing Contractor." Your profile is set to "Construction Company" with "roofing" listed as a service. The category match is where the ranking is decided before any other signal applies.
  4. The owner-response rate is documented as an engagement input. The top-3 respond to every review, sometimes within hours. The response is read as activity by the algorithm and as social proof by buyers. The dual benefit compounds over months.
  5. The photo cadence is the cheapest lever and the one almost no contractor pulls. A new photo every 14 days correlates with top-3 ranking in most markets. Most contractor profiles have not uploaded a photo in the last 90 days.
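
The velocity arithmetic in point 2 is worth running on your own numbers. A minimal sketch, with all review counts as hypothetical placeholders:

```python
# Average days between new reviews: lifetime volume vs. recent cadence.
# The counts below are illustrative placeholders, not data from a real profile.

def days_per_review(reviews: int, window_days: float) -> float:
    """Average number of days between new reviews over the given window."""
    return window_days / reviews

yours = days_per_review(87, 5 * 365)   # 87 reviews over 5 years: one every ~21 days
top3 = days_per_review(12, 90)         # 12 reviews in 90 days: one every ~7.5 days

print(f"You: one new review every {yours:.0f} days")
print(f"Top-3: one new review every {top3:.0f} days")
```

The lifetime total favors you. The recent cadence favors the top-3 roughly three to one, and the cadence is what the freshness signal reads.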

The three layers to read

What the diagnostic actually scores against.

01

The category-match layer

Primary category, secondary categories, services. Whether your profile matches the buyer-search query the way the top-3's profile does.

Read the Reference →

02

The engagement layer

Posts cadence, photo cadence, Q&A response, owner-response rate on reviews. The four signals the algorithm reads as activity.

Read the Reference →

03

The recency layer

Review velocity, photo recency, post recency. The freshness signal that beats raw volume in most local-pack markets.

Read the Position →

What most contractors get wrong here

Three readings that look right and are off by a mile.

  1. Misreading 01

    "The competitor is gaming the system."

    Sometimes. More often the competitor is doing the basic engagement work that nobody else is doing. The ranking is responsive to the work, not to the gaming.

  2. Misreading 02

    "I just need more reviews."

    More reviews help. They do not close a recency, engagement, or category-match gap. Reviews are one signal among many and rarely the binding constraint when the gap exists.

  3. Misreading 03

    "This will sort itself out as Google updates."

    It will sort itself out toward whoever is doing the work. Operators who fix engagement signals tend to hold rank through future updates better than operators who do not. Patience without action is a strategy to stay at rank five.

What gets diagnosed

The seven readings inside a 7-day audit.

  1. Top-3 competitor profile read. Category, services, photo cadence, post cadence.
  2. Your profile read against the top-3 on the same dimensions.
  3. Review velocity comparison. Your recent volume against top-3 recent volume.
  4. Owner-response rate audit. Yours and the top-3's.
  5. Category-match audit against your top three buyer-search queries.
  6. NAP (name, address, phone) consistency across major directories. Where the data disagrees.
  7. Three prioritized moves with the largest expected lift on rank, and the 30/60/90-day timing estimate.

What you get

The value stack at $999.

  1. Written diagnostic report

    Seven days. PDF and editable doc. Three named moves with the rank-recovery timing estimate.

    $2,400 value
  2. Top-3 competitor benchmark

    Full read of the top-3 in your local pack. Posts, photos, Q&A, response rate, review velocity. The numbers you are competing against.

    $900 value
  3. Your-profile gap report

    Side-by-side against the top-3 on every engagement and recency signal. The exact gap by signal.

    $700 value
  4. Category and service-match audit

    Your top three buyer-search queries against your profile categories. Recommended revisions.

    $500 value
  5. 90-day engagement plan

    Posts cadence, photo cadence, response-rate target, review-velocity target. Step-by-step and named for your team.

    $1,100 value
  6. 60-day follow-up review call

    One hour. Re-measure rank. Check the moves that landed. Name what to do next.

    $400 value

Total named value: $6,000. Price: $999. The math defends in 15 seconds.

What you are already paying

Price math against the alternatives in your inbox right now.

Lost local-pack pipeline

$60K+/yr

Rank 5 pulls roughly 93% fewer inquiries than rank 1, per the widely cited industry stat. The yearly gap is rarely under five figures.

SEO agency retainer

$1,800/mo

Blog content, citations, monthly report. Most retainers do not work the engagement layer where the rank actually lives.

The diagnostic

$999

One time. Seven days. Written report you own. Three named moves. Keep it whether you hire us or not.

Common questions

On record.

Why do competitors outrank me with fewer reviews?

Review count is one of many local-pack ranking signals. Engagement signals, NAP consistency, category match, proximity to the searcher, photo cadence, owner response rate, and review recency all contribute. A profile with fewer but recent and engaged reviews often outranks one with many older, lower-engagement reviews.

Should I just stop asking for reviews then?

No. Keep asking. The fix is to add the missing signals on top of the reviews you already have. The diagnostic identifies which signals the top-3 competitors have that you do not.

Is this fixable in 30 days?

Most ranking gaps caused by engagement signals are fixable in 30 to 90 days. Gaps caused by category match or NAP inconsistency are fixable in 7 to 21 days. The diagnostic names which case applies to you.

Do you do the GBP work after the audit?

Yes. If the diagnostic shows that ongoing management is the right next step, that work is available as an ongoing engagement. Most contractors start with the $999 diagnostic and decide from there.

Do you work outside California?

Yes. Stan Consulting works with construction operators across the United States. The office is in Roseville, California.

What if the competitor is using fake reviews?

Some are. Most are not. Engagement signals beat fake reviews in most markets because Google has gotten better at detecting velocity anomalies. The audit names whether reporting the competitor is the right move or a distraction.

Will more photos really change my rank?

In most markets, yes. Photo cadence is one of the cheapest signals to move and one of the most consistently correlated with top-3 ranking. The audit measures your specific market.

The engagement format

Close the rank gap with the top-3 in your zip.

Seven days. Written report. Three named moves. Side-by-side against the top-3 in your local pack on every engagement and recency signal. You keep it whether you hire us or not. The math defends in 15 seconds and the work that moves the rank is on the page.

Get the Diagnostic · $999

Or write with one specific question first.

The rank is not random. The gap is just unmeasured.

Related reading · Marketing Atlas

If you want the structural reading before the audit.

California operators

Construction operators near our Roseville office.