It has been said that a man who represents himself has a fool for a client. The same could be said about a person who does their own SEO. Okay, so maybe optimizing a website isn't as complicated as practicing law, but you can still mess things up if you don't know what you're doing.
There is no shortage of SEO resources on the web, and it's hard for novices to tell which sources contain good information and which will lead them astray. Not all tactics fall squarely into the black hat or white hat camp. Check out these 15 ways that attorneys could inadvertently get their websites penalized with questionable search engine optimization practices.
This is a classic mistake, but some still do it inadvertently. If you have any experience with basic website optimization, you know that jamming a ton of unnecessary keywords into meta tags, alt attributes, and website copy is a dead giveaway for a spammy site.
Sometimes even well-meaning individuals do this to their blog posts or articles by accident. If you are too focused on how many keywords you have on the page, it's easy to add a few too many. Re-read your copy and make sure it sounds natural.
Extra keywords are not going to make a difference. As long as your target keyword phrases appear in your copy and it is clear what the writing is about, you will be fine. Matt Cutts (Google's head of web spam) has explained keyword density well in a video on the topic.
We blog and add content to our websites because search engines love fresh new text and it gives us the opportunity to rank for more keywords. The drawback is that writing new copy is tedious, time-consuming work, so much so that it's tempting to take advantage of article spinning software.
In case you weren't aware, spun content is text that has been re-written by software. It sounds great in theory, but in practice the content often comes out unreadable on the other end. Not only is the text unreadable by humans, it is also recognizable as spun content by search engine spiders and could earn you a manual or algorithmic penalty.
Inbound links are still weighted heavily by search engines (namely Google) and Google works hard to make sure people who manipulate this ranking factor don’t get away with it for long. They outline very clearly in their webmaster guidelines what constitutes a link scheme.
Basically, a link scheme is any time you buy links, exchange links in an excessive or unnatural manner, use automated programs to place links around the web, or launch huge article marketing campaigns that rely on exact-match anchor text.
Ideally, links are supposed to happen naturally; however, this is an impractical expectation. Everyone who is into optimizing websites knows that links to a site give it more weight in search, so they all build them. The idea is to make your inbound links look natural and/or make them useful for those who would follow them.
Keywords and text in general are the primary food for Googlebot. Without text, Google and other search engines would not be able to find and index pages.
With that in mind, some webmasters put hidden text on their pages that is seen by search engines but not by users. They do this by making the text the same color as the background of the page.
Other forms of hidden text include hiding content behind images, setting a font size to zero, or using other style code to position text off the screen. Your best bet to not get penalized for doing this is to not do it. There is really no purpose to hide text other than to deceive someone or something so if you do have a legitimate reason for hiding text, you will probably want to find an alternative method.
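For illustration, here is what those hidden-text patterns typically look like in a page's markup. These snippets are hypothetical examples shown only so you can recognize and remove them, not techniques to use:

```html
<!-- All three of these are the kind of hidden text search engines flag. -->

<!-- 1. Text the same color as the background -->
<p style="color: #ffffff; background-color: #ffffff;">personal injury lawyer best attorney cheap</p>

<!-- 2. Font size set to zero -->
<span style="font-size: 0;">more stuffed keywords here</span>

<!-- 3. Text positioned far off-screen -->
<div style="position: absolute; left: -9999px;">even more keywords</div>
```

If you inherit a site from a previous designer or SEO vendor, it is worth viewing the page source and checking for patterns like these before they cause trouble.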
This one can be subtle or blatant, but it is seldom completely accidental. Sometimes webmasters will mark up content that is misleading, like fake reviews. Other times they may place irrelevant content within markup tags just to get it to show up in search for particular queries.
You should review a good source like schema.org to make sure you are marking up content appropriately. Google can disable rich snippet features for sites that abuse the practice.
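As a minimal sketch of what legitimate markup looks like, here is review markup using schema.org's JSON-LD vocabulary. The firm, client, and rating below are placeholders; the key point is that the markup must describe a genuine review that is actually visible on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": {
    "@type": "Attorney",
    "name": "Example Law Firm"
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "5",
    "bestRating": "5"
  },
  "author": {
    "@type": "Person",
    "name": "A Real Client"
  }
}
</script>
```

If the rating or reviewer in your markup doesn't correspond to content a visitor can see, that is exactly the kind of abuse that can get rich snippets disabled for your site.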
Having thin content on your pages can lead to a penalty; however, there are some caveats. Just having a small amount of text on a page doesn't necessarily mean your site will get penalized. If, however, you have a pattern of spammy tactics and your site overall doesn't offer much value in terms of content, there is a good chance it will get penalized.
When you write copy for web pages, aim for at least 400 words on the page; otherwise, try to find another page to add the content to. If you do publish short amounts of text, try to compensate in some other way, like making the post extremely useful or getting it shared widely around the web.
This specifically refers to the placement of ads on your website. If you do not use ads on your website then you don’t have much to worry about here. Having too many ads above the fold can get you dinged by Google’s page layout algorithm.
The basic idea here is that users don’t want to see a ton of ads. Some ads are fine but people come to websites to get information or accomplish a task. If you bombard people with too many ads, they won’t click on them and will probably leave because the site appears spammy.
Building links to your site is good but you have to maintain a natural link profile. As mentioned before, the ideal scenario is that links happen naturally and not deliberately.
Obviously no one does this on purpose, but building inbound links that appear unnatural is a good way to get your site penalized. When people link to a site naturally, they don't often use exact-match anchor text; typically they use a domain name or arbitrary words like "click here."
Using automated software to copy content from other websites for use on your own site will get you in trouble. Even copying and pasting text word-for-word from other sites can have the same effect.
Be sure that you are always using your own unique and original content. Also make sure that anyone you contract to produce content for your site is on the up and up. Many site owners swear up and down that they did nothing wrong, only to find out the person working for them was up to no good.
Let’s get something straight. There is no such thing (at the time of this writing at least) as a duplicate content penalty. A site can indirectly have negative consequences from duplicated content.
When a search engine finds a page that has blocks of content already in its index, the newly discovered content may not get indexed. So if you have important pages that you want found in search and they are exactly the same as other pages on your site (or someone else's), you may want to think about re-writing them.
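If a duplicate page has to exist, for example a printer-friendly version of an article, a common alternative to rewriting it is a canonical tag in the duplicate page's head, pointing search engines at the version you want indexed (the URL below is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/practice-areas/family-law/" />
```

This tells search engines which copy to treat as the original, so the duplicate doesn't compete with the page you actually want to rank.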
Like hidden text, hidden links are also a no-no. Spammers will hide hyperlinked text among other links or within the body copy of web pages. A user might think they are clicking on one thing when in fact they are clicking on something completely different.
If you have ads that are not clearly marked as such, or that pass PageRank, this could eventually get you into hot water. Google disdains any practice that deceives users or creates a poor experience.
Having ads on your site is fine; just make sure they are clearly marked as advertisements and that their links carry the rel="nofollow" attribute.
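Put together, a properly handled paid placement looks something like this (the label text and sponsor URL are placeholders):

```html
<!-- A paid placement, clearly labeled and nofollowed so it does not pass PageRank -->
<div class="ad">
  <p>Advertisement</p>
  <a href="https://www.example.com/sponsor" rel="nofollow">Sponsor Name</a>
</div>
```

The label tells users it's an ad; the rel="nofollow" tells search engines not to count the link as an endorsement.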
Everything in moderation. Yes, there is such a thing as a website that is too optimized. Going overboard with every optimization technique at once can make a site look manipulative rather than helpful.
When you do optimization work on your site, remember not to overdo it. Follow best practices, but don't make optimizing your site your primary goal. The site should be easy to navigate, relevant to what you want to rank for, and easily accessible. Once you start obsessing over it too much, you start doing things that are unnatural.
No one can help it if their site gets infected with malware, but you can control what happens afterward. If you don't take steps to clean malware from your site promptly, it could start affecting your rankings negatively.
Google and other search engines don’t want to show their users sites infected with viruses.
Cloaking is the practice of showing users one form of content and showing something else to search engine spiders. This is a pretty advanced black hat technique but if you are doing redirects on your site for any reason, be sure that you are sending users to legitimate pages and not trying to hide anything.
Overall, you must not look at these taboo practices in isolation; instead, take a more holistic view. For example, if one page on a 1,000-page website had too many target keywords on it, it's unlikely that the site as a whole would be penalized. In general, it is a pattern of bad tactics across an entire domain that gets website owners in hot water with Google.