9 Fatal SEO Mistakes That Will Obliterate Your Search Traffic

If you go to Google and search for things that will hurt your SEO, you will find generic, sh*t content.

And the post ranking number one will say something like, "Not using Google Webmaster Tools will hurt your website rankings."

So I made a list of actual events that will destroy your website.

Number one: renaming a page. Some website systems change the URL of a page when you rename it. Please do not change the URL of your product pages or your other landing pages. If you ever must change a URL, then make sure you are 100% certain of what you are doing with your redirects. You're probably thinking this is such basic advice, but it happens to everyone, and all of the traffic to that page might be lost. Especially if you don't catch this in time, you will probably never regain what you lost.

Number two on my list is launching a new web design. Besides just being a risky thing to do in SEO in general, launching a new website can leave your traffic dead overnight. I've seen three ways this can hurt your website.

The first way: a Progressive Web App. Before any of you tell me that Google can index client-side-rendered Progressive Web Apps: yes, it can. But if your site used to be server-side rendered and is now client-side rendered via Vue, React or some other JavaScript library, then there's a high probability that everything Google sees is nothing. It's just a blank page, nothing at all. This can do damage to your site that is very hard to come back from. At some point Google will be able to index your website again if you stick with client-side rendering, but in the meantime you might have lost something you will never regain.

The second way: a system change. Some systems use fixed URL structures and some don't. Using Shopify, you are stuck with "collections" and "pages" inside your URLs, whereas if you use WordPress or Magento you don't have to have the same URLs. By moving from one system to another, you can lose everything if you are not 100% certain that your redirects are done correctly. Even if they are done 100% correctly, I have seen websites take a dip and not regain what was lost without doing new SEO work. When you redirect a page, the target has to be similar, the same or better content for the same search intent. If you redirect to something that Google does not consider similar or better, then Google can disregard the redirect entirely.

And the third way: a noindex tag in your header. When developers are making changes or creating a new website, some use robots.txt and some use meta tags to make sure that Google does not index the content. In theory this is a very good thing, but in practice it is pretty dangerous: I have seen sites launched with the noindex meta tag still in place. A much better way to handle a testing site, staging site or whatever you might call it, is to IP-restrict the entire server. Then no one besides the allowed IPs can access the restricted server, and when you deploy the code onto the real live server, everything just works as it should.

Number three: a site security breach. This is the tiny thing you can do wrong that will completely destroy everything you have worked on. WordPress sites in particular are easily broken into, and this is not because someone is specifically targeting your site; it is bots targeting known exploits in your system. You have to keep your system updated and your site security must be on point. Two-factor authentication is advised.

Don't go, the video is not at all done yet. I also interviewed some SEO experts to ask them about SEO mistakes that had real consequences. First, though, here's a minimal pre-launch sanity check you can borrow (sketch below). Then sit back, relax, and enjoy the experts talking.
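
That noindex-still-on-at-launch mistake is easy to catch with a script. Here's a minimal sketch in Python, assuming the `requests` and `beautifulsoup4` packages are installed; the URLs are hypothetical placeholders for your own pages. It flags any page carrying a noindex directive, and a robots.txt that blocks everything:

```python
# Minimal pre-launch sanity check (a sketch, not a full crawler).
# Assumes: pip install requests beautifulsoup4. URLs below are hypothetical.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGES = [
    "https://example.com/",
    "https://example.com/products/",
]

def check_page(url: str) -> list[str]:
    problems = []
    resp = requests.get(url, timeout=10)

    # 1. A leftover "noindex" header from staging will deindex the page.
    robots_header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in robots_header.lower():
        problems.append(f"X-Robots-Tag header says noindex: {robots_header}")

    # 2. Same for a <meta name="robots" content="noindex"> tag.
    soup = BeautifulSoup(resp.text, "html.parser")
    for meta in soup.find_all("meta", attrs={"name": "robots"}):
        if "noindex" in meta.get("content", "").lower():
            problems.append("meta robots tag says noindex")

    return problems

def check_robots_txt(base: str) -> list[str]:
    # 3. A blanket "Disallow: /" left over from staging blocks all crawling.
    txt = requests.get(urljoin(base, "/robots.txt"), timeout=10).text
    lines = [line.strip().lower() for line in txt.splitlines()]
    if "disallow: /" in lines:
        return ["robots.txt contains a blanket 'Disallow: /'"]
    return []

if __name__ == "__main__":
    issues = check_robots_txt("https://example.com/")
    for page in PAGES:
        issues += [f"{page}: {p}" for p in check_page(page)]
    print("\n".join(issues) if issues else "No launch-blocking directives found.")
```

Run it against the live domain the moment a new design ships; it takes seconds and catches exactly the leftover-noindex failure described above.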

- When it comes to SEO mistakes, there are a lot of things you can do wrong, but something I've been seeing over the past couple of years, particularly with some bigger clients, has to do with acquisitions. Companies are either acquiring or being acquired, and when they're getting rid of the old site, they're just redirecting the entire thing to the new site. One of my recent clients actually was acquired by a bigger company. They had fantastic rankings, right? We had worked on the site for years and years and years. The new company decided that they were just gonna redirect the whole thing. So, instead of keeping those rankings, they lost everything. If you're gonna do that, take the time to map those URLs and redirect them to applicable pages, 'cause there's a huge opportunity there.
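
A redirect map like the one described above is tedious to build but easy to verify automatically. Here's a minimal sketch in Python, assuming `requests` is installed; the old/new URL pairs are hypothetical and would normally come from your migration spreadsheet. It confirms every old URL reaches its intended target in a single permanent redirect:

```python
# Verify a redirect map: every old URL should 301 to its mapped target.
# The mapping below is hypothetical; in practice, load it from a spreadsheet.
import requests

REDIRECT_MAP = {
    "https://old-company.example/products/widget": "https://new-company.example/shop/widget",
    "https://old-company.example/about": "https://new-company.example/company/about",
}

for old_url, expected in REDIRECT_MAP.items():
    # Don't follow redirects automatically; inspect the first hop only.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")

    if status != 301:
        print(f"WARN {old_url}: expected 301, got {status}")
    elif location.rstrip("/") != expected.rstrip("/"):
        print(f"WARN {old_url}: redirects to {location}, expected {expected}")
    else:
        print(f"OK   {old_url} -> {expected}")
```

Inspecting only the first hop also surfaces redirect chains, which are worth flattening before launch.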

- Today I'm gonna talk about two examples of mistakes clients have made that impacted their SEO rankings. My first example is a website migration, the worst website migration I've seen go wrong. The client basically onboarded with us three days before the migration. Rule number one: that was probably not a good idea. And number two, they completely changed everything. Anytime you hear someone say "don't change everything all at once", well, they changed everything: the URL structure, internal linking, user journeys, design, content. Everything had effectively been changed, even the user journey to buy or purchase something. Come launch day, none of it had been tested, and everything simply tanked in terms of performance, organically, in rankings as well as conversions. I believe they lost £100,000 in revenue in the first month. Yeah, it was a disaster, not a great situation to be in, especially for the SEO lead, which was me. So if you do a migration, make sure you test and plan effectively, otherwise things can go horribly wrong. My second example is a beauty website based in Manchester. They'd seen drops in rankings for their subcategory landing pages. We had a look, and it was a React-based website where the developers had changed the navigation: they'd put all the subcategory links in the mega menu behind an onclick event. Googlebot can't crawl and find links that sit behind an onclick event, even when rendering JavaScript. That basically removed all the site-wide internal linking to those subcategory pages, which obviously resulted in the ranking drops, and those cost a lot in money terms as well. It wasn't great. You really need to double-check and test your JavaScript websites when you are making any changes to things like mega menus or internal linking structures. And those are my two examples, thank you very much.
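
The onclick problem described above is easy to demonstrate. Here's a minimal sketch in Python with `beautifulsoup4`, using two hypothetical navigation snippets: a crawler that discovers links the way Googlebot does, by collecting `<a href>` attributes, finds nothing in the onclick-only version:

```python
# Demonstrate why onclick-only navigation is invisible to a link crawler.
# Both snippets render identically to a user; only one exposes crawlable links.
from bs4 import BeautifulSoup

CRAWLABLE_NAV = """
<nav>
  <a href="/lipstick">Lipstick</a>
  <a href="/mascara">Mascara</a>
</nav>
"""

ONCLICK_NAV = """
<nav>
  <div onclick="location.href='/lipstick'">Lipstick</div>
  <div onclick="location.href='/mascara'">Mascara</div>
</nav>
"""

def discoverable_links(html: str) -> list[str]:
    # Googlebot discovers internal links from <a href>, not from JS handlers.
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.find_all("a", href=True)]

print("plain links:  ", discoverable_links(CRAWLABLE_NAV))   # ['/lipstick', '/mascara']
print("onclick links:", discoverable_links(ONCLICK_NAV))     # []
```

Running a check like this against the rendered HTML of your mega menu after any front-end change will show an empty list exactly where the subcategory links should be.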

- Hey everyone, this is Michelle Lowery, digital content editor, writer, consultant and educator. I was asked to share an SEO horror story. A few years ago, a business wanted to redo their website, and that included rewriting every page on the site. They decided they wanted to save themselves some money and hired an inexpensive writer. The result was a lot of really bad content and some questionable links on their site. That, along with some other things going on with the site, resulted in them being hit with a manual penalty. They hired a technical SEO consultant, who brought me on, and we were able to pull them out of that penalty. But two things happened as a result of that one choice to hire the cheaper writer. The first is the obvious: the bad content, the bad links, having to deal with the penalty, lost traffic, lost clients, lost revenue. The second is that they ended up paying twice for the same content, and they paid more the second time. Overall, they paid a lot more than if they had just hired a more expensive, more experienced writer to begin with. The question is: how important is your site to you? How important is your business to you? There's no SEO without content; there's nothing to optimize without content. It's the very foundation of that site, of any site. Do you wanna build your house on a shaky foundation? Or do you wanna build something strong that's gonna last? Even if it means you spend a little bit more money upfront, it'll be worth it in the long run. Good luck, bye bye.

- SEO from the Greenhouse here. Many web agencies build development versions of websites for clients on their own systems and leave them open so that the client can view the site. Developers will block the site in robots.txt to stop it getting indexed, and that is often the cause of a new site going live and dropping out of the index. The thing is, anyone can get in and view these sites before they launch, and it is often easy enough to find them, which is a great way for the competition to steal a march. What developers should be doing is not touching robots.txt to stop crawling of the site, but password protecting the area. It's so simple; every web server has this ability built in, waiting to be used. I have a story that is similar, but not quite the same as that. I had been working with a developer who had built a new site for a client. It was quite complex, so with some foresight we prepared all the redirects beforehand. The new site launched and the traffic plummeted. After lots of investigation, I found the client had moved the old site to a subdomain, had not told us about it, and had not password protected it. Google had got in and was crawling and indexing quite happily, causing a massive duplication issue. There's nothing wrong with retaining an old system and its content as a backup in case the launch goes badly wrong; in fact it's a very good idea. This event has improved my migration checklist: I now have a question on what the client is going to do with the old site at launch, so that we discuss it, just in case something like this could happen to someone else.
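
Stories like this suggest an automated guard: before and after launch, confirm that every staging copy and retired host is actually locked away from crawlers. Here's a minimal sketch in Python, assuming `requests` is installed; the hostnames are hypothetical:

```python
# Check that pre-launch / retired hosts are actually locked away from Google.
# Hostnames are hypothetical; list every staging copy and old subdomain here.
import requests

PROTECTED_HOSTS = [
    "https://staging.example.com/",
    "https://old.example.com/",   # e.g. the retired site parked on a subdomain
]

for url in PROTECTED_HOSTS:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
    except requests.RequestException as exc:
        print(f"OK?  {url}: unreachable ({exc.__class__.__name__})")
        continue

    if resp.status_code in (401, 403):
        # Password protection or IP restriction is doing its job.
        print(f"OK   {url}: locked down ({resp.status_code})")
    elif "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        print(f"WEAK {url}: publicly reachable, only noindex protects it")
    else:
        print(f"BAD  {url}: publicly reachable and indexable!")
```

It would not have caught a subdomain nobody mentioned, of course, which is exactly why the "what happens to the old site at launch?" question belongs on the migration checklist as well.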

- Leave a comment below if you have ever made any mistakes that lost you rankings and consider subscribing because then I'll see you in the next video. Thank you for watching.
