Today may be one of those days in Internet history that goes down as a significant change in how things are done. Every so often, an update or change in the ranking algorithms is so major that it affects the way the Internet is used. Google’s official release today of the ‘Farmer Update‘ will most likely be a signal of what’s to come and a dramatic change in how sites are ranked. Just under 12% of search results will be affected, making this one of the biggest changes in Google’s history. Most changes affect less than 10% of results, and in many cases, like last month’s ‘Scraper Update‘, only around 2%. Often the changes are so slight that most people don’t even notice a difference. The game has changed today.
Google has been under a lot of fire lately over its results and how it displays what it calls ‘content farms‘ — low-quality sites that don’t produce enough unique or original content. The new update targets these sites and drops their rankings based on that low quality, hence the name ‘Farmer Update‘. Looking at this change, Google is specifically targeting content that is either low quality or copied from another site. In addition, I believe this update loosens authority models built on trust and makes the results open game for any site, large or small. In the past, many updates didn’t affect large sites that had authority or trust within their segment, but this new update doesn’t seem to follow that model and will target any content Google feels is not of sufficient quality. I’m seeing this happen specifically on sites I work closely with that have a huge brand, lots of quality content, and strong authority, yet still lost rankings and traffic after the update.
Over the past month or so, Google has been trying very hard to stay relevant and deliver good search results. It has been rolling out updates quickly to clean up its results and address the concerns, but it makes me wonder whether it is moving a little too fast and may actually be hurting itself. I, for one, would love to get good results every time and applaud the effort to make the experience better, but to me one of the best models to use is user experience. People will not stay on, or click through, pages that have poor content, and using that behavioral data would produce far better search results than trying to piece together rankings with band-aids and other factors that bump a piece of content up or down based on what the algorithm feels is good quality.