Why Google Can't Explain Every Ranking Drop

In a recent Twitter exchange, Google’s Search Liaison, Danny Sullivan, shed light on how Google manages algorithmic spam actions and ranking drops. This discussion was initiated by a website owner's complaint about a significant loss in traffic and the lack of an option for a manual review.

Sullivan clarified that a site’s drop in rankings could result from an algorithmic spam action or from other factors unrelated to spam. He emphasized that site owners often attribute ranking drops to algorithmic spam actions when that is not the case.

“I’ve reviewed numerous sites where owners complained about ranking losses, assuming they were hit by an algorithmic spam action, but that wasn’t true,” Sullivan stated.

Challenges In Transparency And Manual Intervention

Sullivan acknowledged the desire for greater transparency in Google Search Console, such as notifying site owners of algorithmic actions the way it does for manual actions. However, he highlighted two major challenges:

  • Preventing System Manipulation: Revealing specific algorithmic spam indicators could enable bad actors to manipulate the system.
  • Nature of Algorithmic Actions: Algorithmic actions are not site-specific and cannot be manually lifted.

While Sullivan sympathized with the frustration of not knowing the cause of a traffic drop, he cautioned against the notion that manual intervention would solve the problem, arguing that a manual review could do more harm than good.

“You don’t really want to think, ‘Oh, I just wish I had a manual action; that would be so much easier.’ You don’t want your site to attract our spam analysts’ attention. Manual actions aren’t instantly processed, and it marks your site going forward, especially if it claims to have changed but hasn’t,” he explained.

Determining Content Helpfulness And Reliability

Moving beyond spam issues, Sullivan discussed the various systems Google uses to assess the helpfulness, usefulness, and reliability of content. He acknowledged that these systems are not perfect and that high-quality sites sometimes do not rank as well as they should.

“Some sites ranking well might see a slight drop, causing a notable traffic decrease. They might assume they have fundamental issues when they don’t, which is why we added a section on our debugging traffic drops page,” Sullivan noted.

He also mentioned ongoing discussions about providing more indicators in the Search Console to help creators understand their content’s performance. However, he noted the challenge of preventing system manipulation while offering useful insights.

“I’ve been discussing whether we could show more indicators in Search Console. It’s challenging, like with spam, as we don’t want the systems to be gamed. There’s no simple button to push to make a site rank better. But we’re looking for ways to help creators with better guidance,” Sullivan added.

Advocacy For Small Publishers And Positive Progress

In response to a suggestion from Brandon Saltalamacchia, founder of RetroDodo, about manually reviewing “good” sites and offering guidance, Sullivan shared potential solutions. He mentioned exploring ideas such as self-declaration through structured data for small publishers and using that information to make improvements.

“I’ve been considering and proposing ideas for small publishers to self-declare with structured data. We could learn from that and use it in various ways. It’s early days and no promises, but I hope we can move forward positively,” Sullivan said.
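For readers unfamiliar with the term, “structured data” refers to machine-readable markup, most commonly schema.org JSON-LD embedded in a page’s HTML. Google has not published any self-declaration format for small publishers, so the sketch below is purely hypothetical: it only illustrates the general mechanism, reusing existing schema.org Organization properties for an invented example site.

```python
import json

# Purely hypothetical sketch: Google has not defined any "small publisher"
# self-declaration markup. This only shows how structured data is normally
# embedded in a page, using existing schema.org Organization vocabulary.
publisher_declaration = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Retro Gaming Magazine",  # invented publisher
    "url": "https://example.com",
    "numberOfEmployees": {"@type": "QuantitativeValue", "value": 4},
    "publishingPrinciples": "https://example.com/editorial-policy",
}

# Structured data is typically served inside a
# <script type="application/ld+json"> tag in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(publisher_declaration, indent=2)
    + "\n</script>"
)
print(snippet)
```

Any such markup would only carry weight if Google defined and consumed it, which, as Sullivan stresses, remains an idea under discussion rather than a commitment.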

Sullivan concluded by expressing hope for future improvements but cautioned that changes can’t be implemented overnight. He remains optimistic about finding ways to assist site owners and improve the system.
