Your site just went down. Your robots.txt suddenly excludes thousands of pages from crawlers. You messed up your canonical tags. These are problems most SEOs face during their first years on the job. So how do you avoid these issues? Let's do a thread.



Most of these common issues are caused by releases, deployments of new code, or updating a plugin (if you're on WP or similar). Especially in larger organizations, this happens all the time, and there's nothing wrong with that. In the right setup, it usually helps SEO.



If you're a larger organization that relies on SEO as an important source of traffic, you need to be aware of changes.

At @RVshare + @Postmates we leveraged tools to do so. For some things we created our own scripts to measure changes; for the majority we used http://SEORadar.com . In the end, most orgs don't have their own additional infrastructure to support this.
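To illustrate the kind of script I mean, here's a minimal sketch that snapshots robots.txt and flags changes between runs. The domain and snapshot path are placeholder assumptions, not from any real setup:

```python
# Minimal sketch: flag changes to robots.txt between runs.
# ROBOTS_URL and SNAPSHOT_FILE are placeholder assumptions.
import pathlib
import urllib.request

ROBOTS_URL = "https://www.example.com/robots.txt"
SNAPSHOT_FILE = pathlib.Path("robots_snapshot.txt")

def fetch(url: str) -> str:
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

current = fetch(ROBOTS_URL)
if SNAPSHOT_FILE.exists() and SNAPSHOT_FILE.read_text() != current:
    # A real monitor would diff the two versions and alert the team.
    print("robots.txt changed since the last run -- review it before crawlers do.")
SNAPSHOT_FILE.write_text(current)
```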

Tools like these can help you catch things that break quickly: they run daily crawls, so you don't have to work through a checklist every week to see whether your product pages still contain http://Schema.org markup or the right canonical.
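For illustration (not from any specific tool), a single check of that kind can be as simple as the sketch below. The product URL and expected canonical are hypothetical, and string matching stands in for real HTML parsing:

```python
# Minimal sketch: verify a product page still has Schema.org JSON-LD
# and the expected canonical. PAGE_URL and EXPECTED_CANONICAL are
# placeholders; a real crawler would parse the DOM instead.
import urllib.request

PAGE_URL = "https://www.example.com/product/123"
EXPECTED_CANONICAL = '<link rel="canonical" href="https://www.example.com/product/123"'

def fetch(url: str) -> str:
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

html = fetch(PAGE_URL)
problems = []
if "application/ld+json" not in html:
    problems.append("missing Schema.org JSON-LD")
if EXPECTED_CANONICAL not in html:
    problems.append("canonical tag missing or wrong")
print("OK" if not problems else "FAILED: " + ", ".join(problems))
```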

Over time you can hopefully build this into your workflow and trigger certain checks upon certain deploys. However, only a few companies out there are able to do that effectively today. Summary: keep track of SEO changes to avoid big issues.
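If you want to see what a deploy-triggered check could look like, here's a hedged sketch of a post-deploy gate. The site URL and the two invariants are assumptions, not a real pipeline:

```python
# Minimal sketch: a post-deploy gate that exits nonzero when a release
# breaks a basic SEO invariant. SITE and the checks are placeholders.
import sys
import urllib.request

SITE = "https://www.example.com"

def fetch(url: str) -> str:
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def robots_blocks_everything(body: str) -> bool:
    # A bare "Disallow: /" rule blocks crawlers from the whole site.
    return any(line.strip().lower() == "disallow: /" for line in body.splitlines())

failures = []
if robots_blocks_everything(fetch(SITE + "/robots.txt")):
    failures.append("robots.txt disallows the entire site")
if '<link rel="canonical"' not in fetch(SITE + "/"):
    failures.append("homepage is missing its canonical tag")

for f in failures:
    print("SEO deploy check failed:", f)
sys.exit(1 if failures else 0)
```

Run as the last step of the deploy job, a nonzero exit can fail the pipeline before a broken robots.txt ever reaches crawlers.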

