Attorneys sometimes get bad reputations as ambulance-chasers or for using their legal knowledge to exploit people and businesses. Unfortunately, a few bad apples spoil the bunch, and nowhere is that narrative better personified than in Saul Goodman, the attorney character born on the hit series Breaking Bad who now has his own spinoff series.
If you haven’t had time to catch the show, Goodman (played by Bob Odenkirk) is a self-centered and unethical attorney who does anything he can to get his low-life clients out of trouble. He even had a hilarious website, which is no longer live, and we ran an on-page analysis of it for this post.
Let’s first take a look at the robots file for the site. If you are ever having trouble with pages being indexed in search or not showing up, this is the first thing you should look at. The robots file is typically located at the root of the domain, e.g., www.example.com/robots.txt.
As the file is configured for Saul’s site, all robots are disallowed from crawling the services directory. The rest of the site, however, is fair game. In general, if there are no directories that you don’t want search engines crawling, you can allow all of them. Note that bots are under no obligation to obey the robots file, but most major search companies do.
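Based on that description, Saul’s robots file presumably contained a single Disallow rule for the services directory. Here is a hedged reconstruction (the actual file is no longer available), using Python’s standard-library `urllib.robotparser` to show how an obedient bot would interpret it:

```python
from urllib import robotparser

# Hypothetical reconstruction of Saul's robots.txt (the real file is gone):
# all user agents are blocked from the /services/ directory only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /services/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("*", "/services/index.html"))  # False: services dir is blocked
print(rp.can_fetch("*", "/index.html"))           # True: the rest is fair game
```

As the post notes, this is a voluntary protocol: `robotparser` merely models what a well-behaved crawler would do, and a misbehaving bot can skip the check entirely.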
You should also look for robots meta tags in the header of webpages, which indicate to bots whether the page should be indexed and whether its links should be followed. There are none of those meta tags on Saul’s site. Below is an example of what the tag would look like if it were present.
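A robots meta tag that blocks both indexing and link-following looks like this:

```html
<!-- Placed inside <head>; tells compliant bots not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Since Saul’s pages carry no such tag, every page is eligible for indexing by default.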
Sitemaps give search engines a road map of how a site is laid out. Submitting a sitemap to search engines helps them identify pages more quickly and easily. An XML sitemap should be well-formed and should contain all the URLs (pages) of your website.
Make sure that there are no pages floating outside the site architecture and that you submit a sitemap to a webmaster tools account you have configured for your domain.
You can learn best practices and what your sitemap should look like at sitemaps.org. At first glance, it looks like Saul’s webmaster forgot to put a sitemap on the domain.
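Per the protocol documented at sitemaps.org, a minimal well-formed XML sitemap looks like the sketch below (example.com stands in for the real domain; only the `<loc>` element is required for each URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```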
In Saul’s case, a lot of content is being overlooked by search engines. All the video, the Flash banners at the top of the screen, the images, and other elements are completely invisible to search. To his credit, he does have alt and title attributes filled out.
Here is the site in all its glory.
Have you ever timed yourself sitting at a light while the person in front of you doesn’t go when the light turns green? Even just a few seconds can seem like an eternity. In fact, if a person doesn’t notice the light in just a couple of seconds, we call him or her a moron.
The same is true for websites. When the pages we want to access don’t load quickly, we hit the back button and go elsewhere. Yes, visitors might return later, but you miss out on the chance to make a good first impression if your pages are loading slowly.
You can use free tools like Google’s PageSpeed Insights to find out how fast your pages are and get tips on how to improve them.
Even if your robots file is in order, you should check to see how many indexed pages a site has in search. If there are none and your robots file is fine, that could be an indication of a deeper or more serious problem.
You can search for indexed pages by using the site: operator in the Google search engine. Doing this, we can see that Saul has nine indexed pages (which is pretty much all the site has), so it looks good.
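The query itself is simple; just substitute your own domain for the placeholder:

```
site:example.com
```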
So, you want to see if the number of indexed pages matches the number you would expect to be there. You also want the content of those pages to match what’s currently on your website (i.e., that the pages in Google’s index are not outdated).
While search engines don’t have to use the meta description you provide in their SERPs, they often do. You should make sure that you have a relevant meta description for each page of your site. Put your target keywords for the page in there as well if applicable.
Saul has a little issue with his meta descriptions. They are all the same. Each meta description should be different depending on the page that it is associated with.
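Each page’s head should carry its own description. A sketch of what that looks like, with hypothetical copy:

```html
<!-- Homepage -->
<meta name="description" content="Saul Goodman is an Albuquerque attorney handling criminal matters. Better call Saul!">

<!-- A services page would get different, page-specific copy -->
<meta name="description" content="Criminal defense, personal injury, and class action services from the law offices of Saul Goodman.">
```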
If you are using keywords, you should front-load your title tags with your primary target keyword phrase first. This helps users see it first in search. Saul has his title tags squared away with good, descriptive phrases. It doesn’t appear that he is trying to rank for any keyword terms.
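If he were targeting a keyword phrase, a front-loaded title tag would look something like this (a hypothetical phrase, not taken from the actual site):

```html
<!-- Primary keyword phrase first, brand name after -->
<title>Albuquerque Criminal Defense Attorney | Saul Goodman</title>
```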
Duplicate content is something you should be thinking about but maybe not for the reason you think. There are a lot of misconceptions about duplicate content, including that there is some sort of special penalty reserved for those foolish enough to have two pages with the same stuff on them.
In reality, the negative impact is indirect in that a search engine merely doesn’t know which version is the correct version. In these cases, it may neglect to include a page in its index that you might want in there.
Canonicalization helps solve that issue. If you have a custom-built solution, you can canonicalize your pages with the canonical tag. If you are using a CMS, you should look for a plugin to do the job for you.
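The tag itself is a single line in the page’s head, pointing every duplicate at the version you want indexed (example.com is a placeholder):

```html
<!-- On every duplicate or parameterized variant of the page -->
<link rel="canonical" href="http://www.example.com/services/">
```

Search engines treat this as a strong hint, not a directive, about which URL should represent the page in the index.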
Saul is not canonicalizing his URLs. Granted, he has a nine-page site, but what if a hardened murderer is looking for a page that has not been indexed when it actually exists? How is he supposed to get the information he needs?
This is a tricky one because having thin content doesn’t automatically mean your site isn’t going to do well. In general, you want to have a good amount of content on your pages, but if you have a popular site (i.e., a lot of other sites are linking to it) that doesn’t have much content, it may not be as big of an issue for you.
As evidenced by our crawl of Saul’s site, there is virtually no text-based content on his pages. Most of it is images, video, and Flash. Despite those facts, the site has a PageRank of 6 and ranks very well for terms like “Saul Goodman,” “attorney from Breaking Bad,” and other similar phrases.
On-site optimization for lawyers can get a lot more in-depth than the steps you have seen here. This list is by no means complete, and an analysis may be tailored to your specific goals for optimizing your site. As for Saul, I’m sure he has the connections to get his site up to par. Then again, I doubt he’ll need it with his referral network.