Avoiding certain SEO mistakes can really save you a lot of hassle, time and money. Of course there are plenty of SEO strategies to implement for success but if you make one wrong move, you could end up seriously hurting all of the SEO work you’ve done. We’ve compiled the following list of SEO mistakes to avoid in 2013.
There was a time when Flash websites and Flash intros were all the rage (circa 2001) and, if you didn’t have one, nobody would look twice at your product or service. That time is long gone and if you haven’t done it yet, 2013 is the year to finally move away from Flash.
Flash websites take a long time to load and they do not provide indexable text to search engines. This means that organizations with Flash websites are going to be considerably more difficult to find in search engines and, even when their site is found, the organization may lose visitors because their site takes too long to load.
Flash has been dead for a long time but, based on the number of Flash websites still around, this item needs to be #1 on this list. W3techs.com reports that Flash websites still make up a surprisingly large share of the web: over 15% of all websites on the internet. Move away from Flash and you’ll see the difference!
Content is king. It always has been and always will be. In fact, content has become even more important in recent years due to improvements in search engines accurately ranking web pages based on the value they provide to the end user.
Because so many people have realized how important content is in recent years, there has been a mad dash of people uploading any content regardless of whether or not it provides value to their users. This phenomenon has created copyright problems, duplicate content issues and problems with websites having useless, boring and time-wasting content. It also led to the creation of “content farms” which are discussed in depth in section 6.
That being said, quality content is by far the most important component of your web presence. It is much better to publish one high quality article per week than it is to publish 5 low quality articles.
If you concentrate on high quality content, both people and search engines will like your site more. The higher the quality of content you have, the more you will appear in search engines, the more people will visit your site and the more they will share your content with others. This results in a cycle that leads to lots and lots of happy visitors. Bottom line: quality content is the most organic way to amplify your SEO presence.
On-page optimization consists of structuring your website in a way that is easily read and indexed by bots like search engine crawlers. This includes making use of header tags (H1, H2 and H3 tags), title tags, including meta information and providing search engines with a properly structured sitemap.
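To make this concrete, here is a minimal sketch of those on-page elements (the page title, description and headings are invented placeholders, not recommendations for any real site):

```html
<!-- Hypothetical example: title tag, meta description and heading hierarchy -->
<head>
  <title>Blue Widgets | Example Co.</title>
  <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
</head>
<body>
  <h1>Blue Widgets</h1>            <!-- one H1 per page, naming the main topic -->
  <h2>Why Our Widgets Last</h2>    <!-- H2 tags for major sections -->
  <h3>Materials</h3>               <!-- H3 tags for subsections -->
</body>
```

Notice that the headings form an outline of the page: crawlers use that hierarchy to understand which phrases matter most.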
Proper on-page optimization coupled with quality content results in a site that is very difficult to compete with. Most on-page optimization techniques are tasks that can be done once and will then trickle down through your entire site. This is not to say that on-page optimization is not something that requires maintenance and improvement. As search engines update their algorithms, modifications to your website’s on-page optimization will be needed.
A good example of this is the schema.org initiative launched with the support of Bing, Yahoo and Google in 2011. This initiative is meant to better organize the web, and websites that include the schema markup tags will be more easily indexed and categorized by search engines. It’s one of the first steps the internet as a collective is taking right now to move toward a more semantic web (although, eventually, we’ll probably move into much more robust ontologies for classifying content).
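As a rough sketch of what schema.org markup looks like, here is a hypothetical organization marked up with microdata attributes layered onto ordinary HTML (the company details are made up for illustration):

```html
<!-- Hypothetical business described with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Organization">
  <h1 itemprop="name">Example Co.</h1>
  <a itemprop="url" href="http://www.example.com">www.example.com</a>
  <span itemprop="telephone">+1-555-0100</span>
</div>
```

The `itemscope`/`itemtype` attributes declare what kind of thing the block describes, and each `itemprop` labels a property of it, so a crawler can extract the name, URL and phone number unambiguously instead of guessing from surrounding text.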
In order to ensure that your organization is properly implementing on-page optimization techniques, it may be beneficial to hire an outside agency to provide a report on your current on-page optimization implementation and recommendations for improvements. If you’re interested, SEOcial can help!
Often times, when you hear people discussing search engine optimization, all you hear them talk about is links, links and more links. Although quality links leading to your website will certainly help your rankings in search engines, building links shouldn’t be a substitute for providing quality content. Instead, links should be viewed as complementary elements to your site’s content. The goal of your online presence shouldn’t just be to rank for keywords relevant to you, but rather to provide quality content that makes the internet a better place for users.
By this point, I’m sure you’ve heard of the Penguin and Panda algorithm updates to the Google search engine. These updates penalized certain sites for having unnatural link profiles and links from article directories. One must tread very lightly when building links these days and a great way to gain high quality links is to do what is called “link baiting”. Link baiting is done by producing high quality content and enticing your readers to link to you simply out of a desire to share the content that your organization offers.
The internet is a place to find and share information. One of the main ways this is done is by linking to others’ content that you believe provides value — it’s called “the web” because it consists of web pages interwoven with massive quantities of hyperlinks.
With that in mind, it is quite obvious that the internet would not be nearly as valuable as it is without the use of links. It was mentioned above that certain links may be bad for your site. On the other hand, if the content of your site is providing value to other websites on the internet, there is nothing wrong with sharing your information with them and politely asking if they’d like to link to it. After all, Google’s search engine was originally based solely on link popularity: that was literally all it consisted of.
If you never want your website to show up in Google search results again, then participating in content farms and article directories is the way to go! Content farms are massive repositories that republish junk content over and over again with little to no value to end users.
Historically, an online publisher would publish an article on their own site and then send the same article to a content directory for it to be republished. Sometimes they’d “spin” the article first, by running it through software that mixed up sentence order and substituted some words for synonyms. This method of publishing content is known as syndication. The hope was that this would create more exposure for their article and hopefully drive some traffic to their site.
The problem is that this created a duplicate content issue for search engines. As the Google bots were scouring the web, they would find the exact same article published in two different locations. Initially, this wasn’t really a big deal. When Google realized that people were taking advantage of content farms and artificially improving their rankings in search engines, Google released the Panda and Penguin updates, which penalized sites flagged for duplicate content violations.
What is the moral of the story? Avoid content farms and article directories at all costs. A directory that lists the name of your company and a link to your site is fine. A directory that republishes an article already published on your site however, will really hurt your search engine rankings. If you find yourself in a position where you’ve already been hurt, you can use the Link Disavow option in Google Webmaster Tools to tell Google that you don’t endorse a particular link coming to your site.
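If you do end up filing a disavow, the file you upload is just plain text. As a rough sketch (the spammy domain below is a made-up placeholder):

```text
# Hypothetical disavow file uploaded via Google Webmaster Tools.
# Lines beginning with "#" are comments.

# Disavow a single unwanted link:
http://spam-directory.example.com/copied-article.html

# Disavow every link coming from an entire domain:
domain:spam-directory.example.com
```

Use the `domain:` form sparingly and only when you’re confident nothing worthwhile comes from that site, since it throws away every link from it at once.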
Although we discussed how duplicate content can hurt your search engine ranking when it comes to article directories and content farms, it is important to understand that any form of duplicate content will hurt your site.
This means that, no, your friend cannot republish an article you’ve already published on your site, even with your permission, unless you want both of your sites’ rankings to be penalized.
It also means that if you have a development server or a sandbox environment in which you test potential new features for your live site, you need to make sure that it is not being indexed by search engines. Trust me, I’ve seen sites penalized big time for duplicate content coming from their own test site because they forgot to set the site to not be indexed in search engines.

Don’t forget that duplicate content doesn’t only apply to content syndicated to other websites. You shouldn’t duplicate content on the main content areas of different pages of your own site either. Remember, if you find yourself facing duplicate content issues, the canonical tag or a simple attribution link from the copied content to the original content will solve most problems.
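Both of those remedies are one-line additions to a page’s `<head>`. Here is a sketch, using made-up example.com URLs:

```html
<!-- On a page that duplicates or syndicates content:
     point search engines at the original version -->
<link rel="canonical" href="http://www.example.com/original-article">

<!-- On every page of a development or sandbox site:
     tell crawlers to keep it out of the index entirely -->
<meta name="robots" content="noindex, nofollow">
```

The canonical tag consolidates ranking signals onto the original URL; the robots meta tag keeps your test environment from ever competing with your live site in the first place.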
Over-optimization is a term that encompasses “overdoing” SEO techniques. The most common form of over optimization is that of “keyword stuffing”. Historically, if an organization wanted to rank for a certain keyword in search engines, they would just repeatedly list that keyword all over their website – even if it made no sense when a human being was viewing the site. Search engines quickly caught on to this black hat SEO technique and penalized those implementing it.
Another over-optimization technique is making your readers go to a new page on your site for every sentence or paragraph of a particular article. Webmasters do this so that they receive more “pageviews” on their site and can therefore charge advertisers more money. You can use this technique if your article is particularly long and you just need to break it up for your readers. For instance, this article could probably be safely broken up into 12 different pages. If this article were only 12 sentences long however, it would not make sense to create a separate page for each sentence.
One of the most common forms of over-optimization out there is something we already discussed: unnatural link profiles. Receiving large quantities of links from the same source, getting links from completely irrelevant sites or using the same anchor text across all your inbound links will certainly make Google suspicious!
People often view social media and SEO as completely separate entities. What people don’t realize however, is that social media can play a very large role in determining how your website ranks in SERPs. It was the subject of much debate for a long time amongst SEO experts, but in 2011 Google and Bing finally admitted that social signals influence search directly (even though links from social networks are “nofollow”). Based on the speed at which social media platforms are expanding and being considered as more viable sources of information, there is no doubt that search engines will continue to take social signals into account when ranking websites in search results.
When implementing this aspect of SEO, people generally just think they need to share everything they publish on their social networks. While it is still a good idea to do this, it is also important to make sure that you’re providing your users with the ability to easily share content from your site. This means that you should have clearly visible social sharing buttons alongside all of your content. By enticing users to share your content, you will receive direct traffic from social networks and you will also improve your rankings in search engines.
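Each network publishes its own official button widgets, but even bare share links work. As a sketch (the article URL is a placeholder and must be percent-encoded):

```html
<!-- Hypothetical plain-link sharing buttons for an article page -->
<a href="https://twitter.com/intent/tweet?url=http%3A%2F%2Fwww.example.com%2Farticle&text=Great%20read">
  Share on Twitter
</a>
<a href="https://www.facebook.com/sharer/sharer.php?u=http%3A%2F%2Fwww.example.com%2Farticle">
  Share on Facebook
</a>
```

Plain links like these load instantly and don’t slow your page down the way embedded widget scripts can, which matters given the page-speed concerns raised earlier.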
As you may be beginning to realize, there are a lot of individual aspects of SEO that all add up to create the big picture. As you proceed with your SEO campaign, it is important to utilize all of the different SEO techniques and not put all of your eggs in one basket.
It may be easiest to think about the different aspects of SEO as having synergistic relationships. For example, concentrating on the social sharing aspect of SEO will increase the number of eyes that see your content. The more people that view your content, the more opportunities you have for people linking to your content. The more people that link to your content, the more other people will see it and share it on social networks. The whole process is cyclical and by spending some of your time on all aspects of SEO, you greatly increase your chances of success.
Some aspects of SEO require more maintenance than others. One thing’s for certain however, the process as a whole requires consistency and work. On-page optimization is probably the easiest aspect of SEO to set and forget. As search engine algorithm updates are released however, you will have to make some changes to your on-page optimization.
Producing quality content and building links are the aspects of SEO that require the most maintenance and time. Consistency is very important when it comes to both, so make sure that you publish on a regular basis. You need to always be on the prowl for link opportunities and ideas, as well as ensure that you have a plan to produce content regularly. Having a content release schedule and scouting for a certain number of link opportunities each week is a good way to keep your SEO profile dynamic.
Just because a particular SEO technique works today, does not mean that it will work tomorrow. This is particularly true of techniques that don’t benefit the quality of search results in the long run. Search engines are constantly trying to update their services to provide the most value to the end user. If you find a way to beat the system and get your website to the top of search results without actually providing value, you should be prepared for adverse effects when Google or Bing decides to patch the workaround you’re using.
A strong strategy we use here at SEOcial to anticipate future SEO updates is to frequently search the US Patent and Trademark Office for new search-related utility patents (or acquisitions of patents) from Google and imagine how Google might incorporate a new search methodology into their algorithms.