- This topic has 8 replies, 1 voice, and was last updated 6 months, 3 weeks ago by jakethedogo.
March 5, 2020 at 6:22 pm #1791
seohelper (Keymaster)
I gathered 13,591 reviews across 218 dental practices to analyze for possible fake reviews, and I'd like to share what I found.
My 4 main metrics for finding possible fakes are:
* Review velocity
  * How fast are reviews coming in for this dental practice? A velocity of 1.0 means you can expect roughly one new review per day.
* Review word count
  * How many words does a typical review contain?
* Number of reviews per reviewer
  * What's the median number of reviews that a business's reviewers have posted? What percentage of reviews come from people who have only ever posted that one review?
* Total number of reviews
  * Not an indicator of anything by itself, but it draws attention to practices with an unusually large number of reviews.
Taking all of these into account, I was able to find some “interesting” behavior. First, though, we need to know what's normal.
Across the 218 dental practices, the median review velocity is 0.03 new reviews per day, the median review word count is 25 words, and at a typical practice 47% of reviewers have only ever posted one review.
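For anyone who wants to reproduce these per-practice numbers, here's a minimal sketch of the three metrics. The record layout and the sample rows are my own assumptions for illustration, not the original dataset:

```python
from datetime import date
from statistics import median

# Hypothetical records: (post_date, review_text, reviewer_total_review_count).
# A real run would use scraped data; these three rows are made up.
reviews = [
    (date(2019, 1, 5), "Great dentist, very gentle.", 1),
    (date(2019, 6, 12), "Friendly staff and no wait.", 7),
    (date(2020, 2, 20), "Excellent", 1),
]

# Review velocity: reviews per day over the span from first to last review.
span_days = (max(r[0] for r in reviews) - min(r[0] for r in reviews)).days
velocity = len(reviews) / span_days if span_days else float(len(reviews))

# Median word count per review.
median_words = median(len(r[1].split()) for r in reviews)

# Share of reviewers who have only ever posted the one review.
one_time_share = sum(r[2] == 1 for r in reviews) / len(reviews)
```

With a real scrape you'd compute these per practice and then compare each practice against the corpus medians above.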
Now, on to some outliers.
One practice with 362 reviews has:
* Review velocity of 0.91.
* Median word count of 1.
* 75% of their reviewers are one-time reviewers.
Another example with 625 reviews has:
* Review velocity of 0.61.
* Median word count of 1.
* 63% of their reviewers are one-time reviewers.
These are just a few examples of businesses showing outlier behavior.
What experiences does everyone else have with finding suspected fake reviews? Do you see this a lot, and what tactics do you use to find them?
PS: I have graphs that show the situation more clearly; if you'd like to see them, just PM me or ask in the comments!

March 5, 2020 at 8:18 pm #1793
onemananswerfactory
So… do fake reviews help or hurt the website and/or GMB page?

March 5, 2020 at 9:05 pm #1792
toastshop
On a related note, Trustpilot are scum. Enemy of honest businesses, friend of the scammers.

March 5, 2020 at 10:16 pm #1796
TryAgainNextWeek
This doesn’t account for review generation campaigns like Grade.Us and other review apps that push for reviews.
Oftentimes people will forget, or not really care, to review something; with a nudge, they might.
There are plenty of fake reviews out there, but this wouldn’t account for that.
I'd try scraping the reviews into a CSV, then running =COUNTA() and searching for local-SEO keywords. No sane average person actually includes those in a review, so that could help clear the noise from your data.
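As a rough sketch of that scrape-and-scan idea (the `review_text` column name and the keyword list are assumptions, not from any real dataset):

```python
import csv

# Hypothetical local-SEO phrases; tune this list to your market.
SEO_KEYWORDS = ("dentist near me", "best dentist in", "teeth whitening services")

def flag_keyword_stuffed(path):
    """Return rows of a scraped-review CSV whose text contains a local-SEO phrase.

    Assumes one column named 'review_text'; adjust to your scraper's layout.
    """
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = row["review_text"].lower()
            if any(kw in text for kw in SEO_KEYWORDS):
                flagged.append(row)
    return flagged
```

Substring matching is crude but matches the spirit of the suggestion: no ordinary patient writes "teeth whitening services" in a review, so any hit is worth a manual look.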
March 5, 2020 at 11:33 pm #1794
salimfadhley

Do you have a dataset of reviews anywhere?
I'd be interested to look at bigram and trigram word frequencies. I wonder whether reviews that share unusual bigram and trigram combinations turn out to be fake.
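That n-gram idea can be sketched with document frequency: count how many distinct reviews each trigram appears in, and treat trigrams shared across reviews as candidates for templated text. The three-review corpus below is invented; a real run would use the scraped reviews:

```python
from collections import Counter

def ngrams(text, n):
    """Return the set of distinct n-grams in a review (a set, so each review counts an n-gram once)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

# Hypothetical mini-corpus; real input would be the scraped reviews.
reviews = [
    "best painless dentist in town highly recommend",
    "best painless dentist great with kids",
    "friendly staff and short waits",
]

# Document frequency: in how many distinct reviews does each trigram appear?
doc_freq = Counter(tg for review in reviews for tg in ngrams(review, 3))

# Trigrams shared by more than one review are candidates for copy-pasted or templated text.
shared = {tg: c for tg, c in doc_freq.items() if c > 1}
```

On a real corpus you'd also want to discount trigrams that are common across *all* businesses ("great dentist highly recommend"), since only unusual shared phrases suggest a common author.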
March 6, 2020 at 12:18 am #1797
axdigitalru

Great!

March 6, 2020 at 7:43 am #1798
LaCharentaise
Thanks for sharing these insights!

March 6, 2020 at 11:31 am #1795
Nyxnik
Interesting, but the question is how this affects positioning in Google Search / Maps. It would be worth taking the suspicious businesses with the highest review counts and checking how they actually rank.

March 6, 2020 at 7:26 pm #1799
jakethedogo
Out of curiosity: what would your median word count be if you filtered out no-text reviews? Both for all practices and for the outliers.