When shopping online or checking out a hotel on an app, many of us will go to the review section to sway our decision on whether to buy a product or use a service.
It is quick and easy to take a gander through the comments, see the star ratings and decide whether to make a purchase. Star ratings give you the chance to leave a comment, highlighting your likes and dislikes to the seller. Equally, your opinions give future buyers noteworthy information.
These days, however, we are bombarded with false reviews applauding how dingy cafes are hidden secrets. How that pub in Slough is better than the Turin Palace Hotel. Or how a new fashion magazine surpasses the intellect of Pope.
We live in a manipulative world. Reviews, comments and star ratings have become an increasingly useful sales tool to persuade consumers. The higher the star ratings and the greater the number of positive comments, the higher the sales. Fact.
Real or Fake?
A study by Harvard Business School found that a one-star increase in a restaurant's online rating can raise revenues by as much as 9%. As a result, the internet has been flooded with falsity, with fake reviews promoting products for as little as £5.
Any business can pay someone to write a glowing review of their product. A comical genius by the name of Oobah Butler wanted to demonstrate just how phoney review comments are. Butler created a fake restaurant online, and succeeded in getting it as a top-rated listing on TripAdvisor.
With a new phone number and an 'appointment-only' system, so as to avoid supplying an address, he caused intrigue and excitement. His friends left glowing five-star reviews on a weekly basis. Photographs of 'delicious meals', which in fact comprised shaving foam, a power bleach tablet and honey, were scattered all over social media.
With such splendid reviews, hundreds of people, including celebrities, were calling day and night in the hope of making a reservation. A reservation at a back-garden shed on the outskirts of London.
Fake reviews have rocketed in recent years. In a bid to keep up with the competition, small businesses in particular have paid people to keep their stars up and avoid losing business to big corporations. The logic is stark: create five-star reviews any way you can, or lose custom with a mediocre three-star appraisal.
These fake reviews are, however, hard to spot. When a panel of three specialists at Cornell University tried to identify 400 fake reviews from a total of 800, none could reliably distinguish a fake review from a genuine one.
How do Pansensic tell the difference?
When we analyse a large number of reviews, we use emotion analytics to give an emotion score. The normal distribution would look something like the graph below. As you can see, the distribution is fairly even, with a smooth rise and fall.
However, if people are being paid to write good reviews about a product or service, then the normal distribution would alter to look something like this.
What the data shows
Comparing these two shapes highlights any discrepancy when there is a disproportionate amount of positive or negative feedback. Of course, around 80% of all reviews are usually four stars or more, so the line on the graph above would in practice shift a little to the right. Even so, genuine feedback should still produce a shape similar to the first graph, while rigged reviews or comments show only half the usual distribution.
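The distribution check above can be sketched as a toy in Python. The emotion scores themselves would come from a sentiment model (Pansensic's own scoring is proprietary), so here they are assumed to be simple integers from -2 to +2, and the imbalance threshold is an illustrative assumption, not a tuned parameter:

```python
from collections import Counter

def emotion_distribution(scores, bins=(-2, -1, 0, 1, 2)):
    """Bucket per-review emotion scores into a histogram over the bins."""
    counts = Counter(scores)
    return [counts.get(b, 0) for b in bins]

def looks_rigged(scores, threshold=0.8):
    """Flag a review set where the mass piles up on one side instead of
    forming the smooth rise and fall of a genuine distribution."""
    positive = sum(1 for s in scores if s > 0)
    negative = sum(1 for s in scores if s < 0)
    total = positive + negative
    if total == 0:
        return False
    return max(positive, negative) / total >= threshold

genuine = [-2, -1, -1, 0, 0, 0, 1, 1, 2]  # smooth rise and fall
rigged = [0, 1, 1, 2, 2, 2, 2, 2, 2]      # mass piled on the positive side
```

Here `emotion_distribution(genuine)` gives the symmetric `[1, 2, 3, 2, 1]` shape of the first graph, while the rigged set trips the one-sidedness check.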
At a more detailed level, after highlighting any discrepancy, we can identify false reviews individually. The simplest way to spot these is by detecting duplicates, or 'dupes'. Let's say that someone was paid to create and distribute one hundred false reviews. The easiest way to do this would be to write one positive review and copy and paste it a hundred times across other sites. This approach, however, is unsophisticated, so two or three words are usually changed for each new post.
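A first pass for the lazy copy-and-paste case can be sketched in a few lines of Python. This is an illustrative toy, not our production pipeline: it normalises each review body and groups identical ones.

```python
from collections import defaultdict

def normalise(text):
    """Lower-case and collapse whitespace so trivial edits don't hide a dupe."""
    return " ".join(text.lower().split())

def find_dupes(reviews):
    """Group reviews whose normalised body is identical.
    `reviews` is a list of (review_id, text) pairs."""
    groups = defaultdict(list)
    for review_id, text in reviews:
        groups[normalise(text)].append(review_id)
    return [ids for ids in groups.values() if len(ids) > 1]

reviews = [
    ("r1", "Absolutely wonderful food and service!"),
    ("r2", "absolutely  wonderful food and SERVICE!"),
    ("r3", "Decent pub, slow service on Sundays."),
]
```

Running `find_dupes(reviews)` groups r1 and r2 together despite their cosmetic differences, while the genuine r3 stands alone.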
We have the expertise to single out these words.
From our experience, those who are paid to write false reviews will generally change the first or last word or two, while the body of the text remains the same; it is quicker than finding something to alter in the bulk of the message. We then look at the name of the author. On most websites, dupes are predominantly posted by the same writer. Unless that author is clever enough to create one hundred different users and generate hundreds of very different reviews, we will spot them.
For someone to take the time to create different users and write differing comments for every post is a rarity. More often than not, these people are paid per comment. It does not matter to the individual posting that they may be writing the exact same thing for fifty differing places.
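The edge-editing habit described above can itself be exploited: strip a couple of words from each end of the message and compare what is left. A sketch, where the trim width of two words is an assumption for illustration rather than a fixed rule:

```python
from collections import defaultdict

def core_tokens(text, trim=2):
    """Keep the body of the message, dropping the first and last couple
    of words, which is where paid posters typically make their edits."""
    words = text.lower().split()
    return tuple(words[trim:-trim]) if len(words) > 2 * trim else tuple(words)

def near_dupes(reviews):
    """Group reviews that share the same trimmed body.
    `reviews` is a list of (author, text) pairs."""
    groups = defaultdict(list)
    for author, text in reviews:
        groups[core_tokens(text)].append(author)
    return [authors for authors in groups.values() if len(authors) > 1]

posts = [
    ("alice", "Honestly the best curry house in town, five stars"),
    ("alice", "Truly the best curry house in town, well deserved"),
    ("bob", "Quiet spot with an average breakfast menu"),
]
```

Here `near_dupes(posts)` matches the two edge-edited reviews and, tellingly, both come from the same author.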
But what happens if two genuine messages are the same?
This is hugely improbable. Say you have a message of forty characters, and a second message of the same length. There are twenty-six letters in the alphabet, so for each position the chance of the two letters matching is one in twenty-six. The chance of the first letters matching is one in twenty-six; the same goes for the second letter, and so on. By the time you get to the fortieth character, the odds of both messages being exactly the same are one in twenty-six to the power of forty (roughly 3.97 × 10⁵⁶).
This doesn't even take into consideration digits, capital letters, or punctuation. So the chance of a message, especially one as long as forty characters, being duplicated by accident is vanishingly small.
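The arithmetic is a one-liner to verify:

```python
# Each of the forty positions must match independently, and each position
# has a 1-in-26 chance of matching, so the odds of two random messages
# coinciding are 1 in 26**40.
possible_messages = 26 ** 40
print(possible_messages)  # a 57-digit number, roughly 3.97e+56
```

Expanding the alphabet to include digits, capitals and punctuation only makes the base larger and the coincidence even less likely.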
Thankfully, false reviews are a rarity within the data we analyse. But as they become a larger issue on review sites, they do crop up from time to time.
For instance, we were asked to analyse over 300,000 reviews of a Japanese brand of robot hoovers. Here we found that the same comments had not only been repeated, but duplicated on other review sites. Again, either the first or last words were altered in each review.
Here are our tips for distinguishing genuine reviews from false ones yourself.
- Look at the author. See how many posts they have made and whether those posts use the same language. Check when each post was made: if they hadn't posted in the last four months but have made 200 in the last week, beware.
- When reviewers are commenting on a product, the name of the product will usually be shortened. If I were to write a review of British Airways, I would simply type BA. A fake reviewer will write the full name out as often as possible to boost the SEO (search engine optimisation) for that item.
- Watch out for the tone of voice. If there are a million adjectives thrown in, yet the description of this amazing, fantastic, glowing product is vague, be sure they have no idea what they are talking about.
- Look for extremes! If the majority of reviews are three to four stars and then you get a one-star review saying atrocious things, this could be the work of a rival company trying to undermine the competition. Equally, if a product has a track record of pretty low reviews and then suddenly gets a glowing report, be wary.
- Check for 'Verified Purchase'. If the item was bought from the site, it will show whether the reviewer actually purchased the item or not. No purchase, no validity to their comment.
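The checklist above could be mechanised as a rough heuristic. This is a sketch, not Pansensic's method; every field name and threshold here is an assumption for illustration, and real detection would need far more nuance.

```python
def suspicion_score(review):
    """Count red flags from the checklist. `review` is a dict whose
    keys and thresholds are illustrative assumptions only."""
    score = 0
    # Burst posting: a flood of recent posts after months of silence.
    if review["posts_last_week"] > 50 and review["posts_prior_months"] == 0:
        score += 2
    # Full brand name spelled out repeatedly (SEO stuffing).
    if review["full_brand_mentions"] >= 3:
        score += 1
    # Gushing adjectives with no concrete detail.
    if review["adjective_count"] > 10 and not review["has_specific_details"]:
        score += 1
    # Rating far outside the product's usual band.
    if abs(review["stars"] - review["product_avg_stars"]) >= 3:
        score += 1
    # No verified purchase behind the comment.
    if not review["verified_purchase"]:
        score += 1
    return score

suspect = {"posts_last_week": 200, "posts_prior_months": 0,
           "full_brand_mentions": 5, "adjective_count": 14,
           "has_specific_details": False, "stars": 5,
           "product_avg_stars": 2, "verified_purchase": False}

ordinary = {"posts_last_week": 1, "posts_prior_months": 3,
            "full_brand_mentions": 0, "adjective_count": 3,
            "has_specific_details": True, "stars": 4,
            "product_avg_stars": 4, "verified_purchase": True}
```

A burst-posting, brand-stuffing, unverified five-star review trips every flag, while an ordinary verified review scores zero.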
How do you spot fake reviews in your business?
These pointers may help at home. But for those who want to spot fake reviews posted about their company or product, Pansensic’s data analysis can help.
We work with organisations from all over the world in a wide range of sectors. What they have in common is the understanding that the experiences of their customers, staff and stakeholders can provide game-changing insights.