In modern search marketing, Google is seemingly tweaking its algorithm on a daily basis. But how can you find out what happened, when it happened and how it has affected your site?
Luckily there are a host of tools out there that can give you detailed insights as to what changed in the Google algorithm and how it may have affected your site and the search space surrounding it.
One of the biggest tools is Moz’s ‘weather report’ tool Mozcast. Mozcast uses a weather metaphor, reporting how active the algorithm has been over the last five and thirty days based on how hot ‘the weather’ (Google’s algorithm) is on a given day.
Another tool that has earned some deserved attention is Algoroo. Algoroo does not use a weather format; however, it offers similar insights into how the Google algorithm has changed recently and delivers them in a straightforward, easy-to-understand way.
This post will therefore compare and contrast the two tools and judge which is better equipped for a place in your search marketing toolkit.
Round 1 – Initial Thoughts
Whilst Mozcast’s weather system is an extremely creative way to present day-to-day algorithm activity, it can be a little confusing and ‘too busy’ for beginners, with lots of elements demanding your attention at once.
The layout features the ‘weather’ for today, the results for the last five days, the weather for the last thirty days, and different tabs explaining the metrics and features. A little confusing, right? The page feels cluttered because it squeezes all of this data onto one screen, instead of either splitting it across different pages or stacking the data vertically and encouraging users to scroll.
Algoroo on the other hand offers a much cleaner display with all the data you need presented on one static page.
In addition, the Algoroo page shows the date on the X axis of its graph, something Mozcast fails to do. The graph also highlights spikes in algorithm activity visually by changing the bars from green to orange. This makes Algoroo’s data instantly easier to interpret at a top level.
Plus, Algoroo offers a much wider data set, letting you look at algorithm activity back to January 2013, whereas Mozcast only offers the last ninety days.
Mozcast fights back, however, by showing data for the last five days on its homepage, whereas Algoroo only lists the previous day’s activity on its homepage.
Close first round, but the winner is: Algoroo
Round 2 – Speed and efficiency
With any tool that relies on consistently fresh data, both of these services need to be fast in order to keep your attention and immediately give you the data you need.
According to Mozcast, their data is updated “every 24 hours” and they “track a hand-picked set of 1,000 keywords and grab the top 10 Google organic results”. The keywords are reportedly “tracked at roughly the same time every day from the same location”. This gives Mozcast an extremely efficient way of drawing the data and comparing it to the previous day.
For example, if they were to pull the data at different times of day, the measured activity could be skewed by ordinary time-of-day churn rather than genuine algorithm changes, and a fair day-to-day comparison would not be possible.
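To make the methodology concrete, here is a minimal sketch of how a SERP ‘weather’ score could be computed from daily top-10 snapshots. Neither Mozcast nor Algoroo publishes its exact formula, so the rank-shift metric below (and the two-keyword example data) is purely an illustrative assumption, not either tool’s real method.

```python
# Hedged sketch: neither Mozcast nor Algoroo publishes its formula.
# This rank-shift metric is an illustrative assumption only.

def serp_change(today, yesterday, depth=10):
    """Score one keyword's top results: 0.0 = identical, 1.0 = all new."""
    score = 0.0
    for rank, domain in enumerate(today[:depth]):
        if domain in yesterday[:depth]:
            # Penalise by how far the domain moved, relative to depth.
            score += abs(rank - yesterday.index(domain)) / depth
        else:
            # A domain new to the top results gets the maximum penalty.
            score += 1.0
    return score / depth

def daily_temperature(serps_today, serps_yesterday, depth=10, baseline=70):
    """Average the change across all tracked keywords, scaled so that a
    'typical' day (average change of 0.5) lands on the baseline; Mozcast
    pegs its normal day at 70 degrees Fahrenheit."""
    changes = [serp_change(serps_today[kw], serps_yesterday[kw], depth)
               for kw in serps_today]
    return (sum(changes) / len(changes)) * 2 * baseline

# Hypothetical two-keyword example: one SERP where positions #1 and #2
# swapped overnight, and one SERP that did not move at all.
today = {"seo tools": ["b.com", "a.com", "c.com"],
         "link building": ["x.com", "y.com", "z.com"]}
yesterday = {"seo tools": ["a.com", "b.com", "c.com"],
             "link building": ["x.com", "y.com", "z.com"]}
print(round(daily_temperature(today, yesterday, depth=3), 1))  # prints 15.6
```

The sketch also shows why consistent timing matters: if `today` and `yesterday` were captured at different hours, the rank shifts would mix routine fluctuation with genuine algorithm activity.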
Algoroo currently offers no information on its site about when it pulls its data or how quickly it does so. One assumes the data is also pulled every 24 hours, since the graph is clearly updated daily. However, with no evidence of when the data is actually generated, or under what circumstances, Algoroo loses this round to Mozcast.
Based on the more detailed information, the winner is: Mozcast
Round 3 – Insights
Playing around with both tools for only a few minutes is enough to see that Mozcast has the superior insights and analysis. There are multiple tools and graphs showing the data from different perspectives and angles, whereas Algoroo only shows the basics on its homepage.
Quite rightly Moz highlights that “one number can’t tell the whole story of something as complex as the Google algorithm” and subsequently shows a plethora of graphs to try and understand as much of the algorithm activity as possible.
Mozcast boasts analysis on domain diversity, SERP count, EMD influence, PMD influence and even the daily ‘Big 10’ most influential sites:
Only one winner in this round: Mozcast.
Round 4 – Analysing the data
Now we come to the most important part about both of these tools, the data.
Both sites update regularly, and it is safe to assume that they use different formulas to calculate Google’s algorithm activity. That, at least, could explain why there is a significant difference between the two tools on certain dates.
For the first example I looked at the 19th of May:
As you can see, there is quite a clear difference between the two sets of data. Whilst both tools show that there was indeed a spike in activity on this day, Mozcast considers the spike to be far more powerful and significant than Algoroo does.
Clearly we need to look into this date further to see how extreme the spike actually was and which tool measured it more accurately.
After a bit of exploring, it appears there was some chatter the weekend before the 19th of May, with many saying they had spotted fluctuations in the SERPs.
Barry Schwartz from seroundtable.com published an article saying that while there was no conclusive proof, he felt it was almost certain that Google was up to something big. What that could have been is anyone’s guess, although Schwartz curated rumours saying “some are suspecting a massive Penguin update is about to hit, while others think it might be a Panda refresh and others think it is just Google’s normal actions on link networks”.
On May 21st, however, it was confirmed by Search Engine Watch that Panda 4.0 had been launched. Was this the reason for such a spike in algorithm activity just two days before? Either way, it suggests Mozcast was right to present such a significant spike, and it leads us to wonder why Algoroo showed such an underwhelming one.
In order to see whether this was just a one-time underestimation from Algoroo, we must dig a little deeper by looking at more dates.
Let’s look at the aforementioned Panda 4.0 update to see how both tools reported the update’s release.
Mozcast did indeed spot the rollout of Panda 4.0; however, its reported spike was a lot less severe than Algoroo’s.
We can see Algoroo reported the activity spike to be a lot more severe than Mozcast did. Was this another strange occurrence? Let’s investigate further.
The next date we are going to look at is 8th May 2014:
Once again we can see two different levels of activity from the two tools on the same day. Mozcast shows that nothing of any real note happened, with the bar falling just under 60 on its temperature scale. Algoroo, on the other hand, reported a significant spike in activity on this day according to its calculations. But which tool was more accurate? Once more, it is important to dig deeper and find out.
Once again Barry Schwartz was on the case at Search Engine Land with this post where he said “I’ve asked Google if there was, indeed, an update, and Google would not confirm.”
Schwartz was confident there was something going on, asking: “Was it an algorithmic update? Was it Penguin? Was it a small Panda refresh? Was it a new algorithm? Maybe Google is testing an algorithm to a select subset of searchers.”
Schwartz continued “Without Google confirming the update and telling us specifically what changed, it is impossible for me to tell you with certainty if there was an update and what the update was. All I can say is that there are many webmasters and SEOs talking as if there was some sort of Google update.”
So it appears there was sizeable activity that got a lot of people talking. This conversation corroborates the activity spike on Algoroo’s tool and leaves us asking why Mozcast did not pick up on it too.
The widely respected Moz resource Google Algorithm Change History strangely does not show any recent updates or much talked about spikes for May at all. Instead, the last update they discuss on the resource was on March 24th.
It is certainly strange that Moz has not included the two May activity occurrences we have highlighted in their resource, with no explanation as to why.
In order to try and find out why they have not mentioned them, and why the May 8th spike didn’t register as high on Mozcast, I contacted Moz via Twitter:
After a few hours, Moz got in touch and replied with the following:
It is interesting that the update was on May 8th, yet they have still not updated their resource. Does Moz feel the activity was not as big as Algoroo after all?
I also wanted to see whether the data from Moz’s Google Algorithm Change History resource matched up with the Algoroo tool, and whether there were any discrepancies. The blue markers are Moz’s and the red are Algoroo’s; where blue and red appear together, both spotted the same update.
The findings are pretty interesting. We can see that both Moz and Algoroo found the Penguin 2.1 update along with the Hummingbird update. However, Algoroo spotted three additional spikes of activity on 27th November, 17th January and 24th January. After looking around the web I found a few mentions in forums of the 27th November update and some conversations about the January activity too.
We can see that the Hummingbird update, which apparently affected about 90% of searches worldwide, was understandably pretty sizable. However it is interesting that the Penguin 2.1 update, which was said to only affect 1% of queries, also had a large spike. These findings then suggest that Algoroo’s graph does not successfully represent the differing effect each update had on search queries.
Mozcast reported Penguin 2.1 with a sizeable spike too at 72 degrees:
It is interesting to see such a high spike despite the update only affecting 1% of search queries.
For the Hummingbird update, however, we can see an enormous spike for the update that affected 90% of search queries:
In conclusion, it looks like Mozcast has more realistic spikes on their graphs when compared to Algoroo.
As I mentioned earlier in this post, it is more than likely that the two tools use separate formulas to interpret Google’s algorithm activity. It is interesting that they pick up different spikes; however, it is surprising that both noticed the May 19th spike (to differing degrees of severity) but only Algoroo noticed the May 8th activity spike.
Close again but the winner has to be: Algoroo.
And the winner is…
A closely fought battle between two well-respected and efficient tools, each with its own positives and negatives.
Moz offers much deeper insights, yet the tool’s layout is a lot less straightforward than Algoroo’s. Algoroo offers significantly less analysis, but it picked up on a decent-sized algorithm spike that Mozcast missed.
If I had to choose between the two, I wouldn’t. Instead I would use both, as you can never have too many insightful tools as a search marketer. Plus, it is always recommended to verify data against another source in order to validate and confirm your first impressions.