When shopping online, checking out a hotel on an app or scrutinising restaurants in the local area, many of us will go to the review section to help us decide whether to buy or use the product or service. It is quick and easy to take a gander through the comments, see the star ratings and make, or not make, a purchase. It is equally easy to leave a comment yourself so that, first, your likes and dislikes are highlighted to the seller and, second, possible future buyers have your opinions as noteworthy information.
These days, however, we are bombarded with reviews applauding dingy cafes as hidden secrets, claiming that a pub in Slough is one hundred times better than the Turin Palace Hotel, or that a new fashion magazine surpasses the writings of Wordsworth and Pope, or even Joyce's Ulysses. We live in a manipulative world, and reviews, comments and star ratings have become an increasingly powerful sales tool. The higher the star rating and the greater the number of positive comments, the higher the sales: fact!
A study by Harvard Business School even highlighted that a single additional star on sites such as TripAdvisor can increase revenues by up to 9 percent. As a result, the internet has been flooded with falsity, with fake reviews promoting products for as little as £5. Any business can pay someone to write a glowing review of its product, and only recently a comic genius by the name of Oobah Butler (a possible alias?) set out to demonstrate just how phoney the comments on websites such as TripAdvisor are. Butler had a career in creating fake reviews himself but wanted to take it a step further. Rather than just highlighting fake reviews, he not only created a fake restaurant online but succeeded in getting it to the top of TripAdvisor's listings. With a new phone number and an "appointment-only" system, so as to avoid supplying an address, he caused intrigue and excitement. His friends left glowing five-star reviews on a weekly basis for the elegant and vibrant 'Shed at Dulwich'. Photographs of 'delicious meals', which actually consisted of shaving foam, a bleach tablet and honey, were scattered all over social media. With such splendid reviews, hundreds of people, including celebrities, were calling day and night in the hope of making a reservation at a back-garden shed on the outskirts of London.
Thus, not only has the issue of fake reviews rocketed in recent years but, in a bid to keep up with the fakery, small businesses in particular have had to write, or pay people to write, reviews that keep their stars up and avoid losing out to big corporations. The thinking in business seems to be 'create five-star reviews in any way you can, or lose custom with a mediocre three-star appraisal'.
Such fake reviews are, however, hard to spot. When a panel of three specialist judges at Cornell University tried to pick out 400 fake reviews from a total of 800, none could reliably distinguish a fake review from a genuine one.
So how do we, at Pansensic, distinguish between genuine and fake reviews?
Well, when we analyse a large number of reviews, we use emotion analytics to give each an emotion score. The normal distribution would look something like the graph depicted below. As you can see, with one to five stars on the x-axis, the distribution is fairly even, with a smooth rise and fall.
However, if people are being paid to write good reviews about a product or service, then the normal distribution would alter to look something more like this.
This uneven distribution highlights any discrepancy when there is a disproportionate amount of positive, or indeed negative, feedback for a product. Of course, around 80% of all reviews are usually four stars or more, so the line on the graph above would actually shift a little to the right. However, the positive and negative feedback should still produce a graph with a shape similar to the first one shown, while a rigged set of reviews shows only half the usual distribution.
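Pansensic's emotion-analytics scoring is proprietary, but the shape check itself can be sketched in a few lines of Python. The function names and the 80% threshold below are illustrative assumptions, not our actual model:

```python
from collections import Counter

def star_distribution(ratings):
    """Return the share of reviews at each star level (1-5)."""
    counts = Counter(ratings)
    total = len(ratings)
    return {star: counts.get(star, 0) / total for star in range(1, 6)}

def looks_rigged(ratings, top_share=0.8):
    """Flag a review set whose mass piles up almost entirely on five stars,
    i.e. the lopsided shape described above rather than a smooth rise and fall."""
    dist = star_distribution(ratings)
    return dist[5] >= top_share and (dist[1] + dist[2] + dist[3]) < 0.1

# An organic spread rises and falls; a rigged one is all five-star.
organic = [1, 2, 2, 3, 3, 3, 4, 4, 4, 4, 5, 5, 5]
rigged = [5] * 18 + [4, 4]
```

In practice the thresholds would be tuned per sector, since some product categories genuinely skew positive.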
At a more detailed level, after highlighting any discrepancy, we can then identify fake reviews individually. The simplest way of identifying these false reviews is by detecting duplicates, known in the IT world as 'dupes'. Say someone is paid to create and distribute one hundred false reviews; the easiest way to do this is to write one positive review and copy and paste the same comment a hundred times across other sites. That, however, would be unsophisticated, so instead two or three words are usually changed for each new post. Despite these changes, we have the expertise to single out the duplicated text.
From our experience, those who are paid to write false reviews will generally change the first or last word or two, while the body of the text remains the same; it is quicker to tweak the edges of a message than to rework its bulk. We then look at the name of the author. On websites such as Amazon, dupes are predominantly posted under the same username, and unless the author is diligent enough to create one hundred different accounts and generate one hundred genuinely different reviews, we will spot them. Taking that much time and effort is a rarity: these writers are usually paid per comment, and as long as each comment praises the product, company or place, it does not matter to them that they are posting the exact same thing in fifty different places.
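Catching a review whose edges have been tweaked but whose body is unchanged is a near-duplicate problem, and the basic idea can be approximated with standard string similarity. A minimal sketch using Python's built-in difflib (our in-house matching is more sophisticated, and the 0.9 threshold here is an illustrative assumption):

```python
import difflib

def is_near_dupe(a, b, threshold=0.9):
    """Treat two reviews as near-duplicates when their texts are almost
    identical -- e.g. only the first or last word or two has been swapped."""
    ratio = difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return ratio >= threshold

# Only the opening word differs, so the pair scores as a near-duplicate.
original = "Absolutely wonderful service, the staff could not have been more helpful."
tweaked = "Truly wonderful service, the staff could not have been more helpful."
```

Running every pair through a check like this scales poorly, so in practice candidate pairs are usually pre-filtered (for example by author or by shared phrases) before scoring.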
Some readers may be wondering what happens if two messages just coincidentally happen to have the same text. Well, this is hugely improbable. Say, for argument's sake, you have a message of forty characters and a second message of the same length. With twenty-six letters in the alphabet, each character has a one-in-twenty-six chance of matching the corresponding character in the other message. The chance of the first letters matching is one in twenty-six; the same applies to the second letter, and so on. By the time you get to the fortieth character, the odds of both messages being exactly the same are one in twenty-six to the power of forty, roughly one in 3.97 × 10^56. That doesn't even take into account digits, capital letters or punctuation, all of which make a coincidental match even less likely. So the likelihood of a message, especially one as long as forty characters, being duplicated by accident is vanishingly small.
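The arithmetic above is easy to verify directly, since the chance of an exact coincidental match is simply one over the number of possible forty-character messages:

```python
# Probability that two independent 40-character, lowercase-only messages
# match exactly: a 1-in-26 chance per position, multiplied across 40 positions.
ALPHABET = 26
LENGTH = 40

combinations = ALPHABET ** LENGTH  # 26**40 possible messages
probability = 1 / combinations     # chance of an exact coincidental match

print(f"{combinations:.4e}")  # roughly 3.97e+56
```

Adding digits, capitals and punctuation grows the alphabet well beyond 26, shrinking the probability further still.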
Also, from our analysis, duplicates of more than around forty characters are particularly uncommon, largely because forty characters take longer to type than ten, and for those being paid to praise or condemn a product, it is often tricky and time-consuming to write at length about a product or place they actually know little about.
Thankfully, fake reviews are a rarity within the data we analyse, but as they become a larger issue on review sites, they do crop up from time to time. Without naming names, in one contract we were asked to analyse over 300,000 reviews of a Japanese brand of robot hoovers. Here we found that the same comments had not only been repeated but duplicated on other review sites, with either the first or second letter altered in each review.
If you want to try to distinguish between genuine and fake reviews online when shopping, or even just browsing a product, here are our five top tips!
- Keep an eye out for the author: see how many posts they have made and whether or not these posts all sound strangely similar and use the same kind of language. Also, check when each post was made. If they haven't made a single post in the last four months, yet have made 200 in the last week, err on the side of caution.
- When reviewers are commenting on a product, the name of the product will usually be shortened. For instance, if I were writing a review of British Airways, rather than repeat the full name each time I would simply type BA. By contrast, a fake reviewer will write the whole thing out, as many times as possible, in order to boost the SEO (search engine optimisation) for that item.
- Watch out for the tone of the comment. If there are a million adjectives thrown in there, yet the description of this amazing, fantastic, glowing product is vague, be sure they have no idea what they are talking about!
- Look for extremes! If the majority of reviews of a product are three to four stars, and then you get a one-star review saying atrocious things, this could be the work of a rival company trying to lower its competitor's business... what we classify as a "sneaky-douchebag move". Equally, if a product has a track record of pretty low reviews and then suddenly gets a glowing report, be wary.
- Check the 'Verified Purchase' label. If the item was bought from a site such as Amazon, it will show whether the reviewer actually purchased the item under their name. No purchase, no validity to their comment.
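The tips above are human heuristics, but several of them can be mechanised. A rough sketch of a rule-based checker, where the Review fields, the 50-post burst threshold and the three-mentions SEO rule are all hypothetical assumptions for illustration, not any site's real API:

```python
from dataclasses import dataclass

@dataclass
class Review:
    """Hypothetical review record; field names are illustrative."""
    text: str
    verified_purchase: bool
    posts_last_week: int          # author's posts in the past week
    posts_prior_four_months: int  # author's posts in the four months before that

def red_flags(review, full_product_name):
    """Score a review against the tips above; each hit is one warning sign."""
    flags = []
    if not review.verified_purchase:
        flags.append("no verified purchase")
    if review.posts_last_week > 50 and review.posts_prior_four_months == 0:
        flags.append("sudden burst of posting after months of silence")
    if review.text.count(full_product_name) >= 3:
        flags.append("full product name repeated (possible SEO stuffing)")
    return flags
```

A review that trips several of these rules isn't proof of fakery, but it is a strong prompt to read it with a sceptical eye.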
These pointers may help at home, but for those of you who want to decipher the number of fake reviews posted about your company or product, and what those fake reviews are saying, the data analysts at Pansensic can help. We work with organisations from all over the world in a wide range of sectors. What they have in common is the understanding that the experiences of their customers, staff and stakeholders can provide game-changing insights. We do not simply summarise data; we provide a real depth of insight that is more granular, more accurate and more actionable.