Those of us who work in the web space are used to things always being in flux. New technologies come into use every day, and as someone involved in SEO, I bear in mind that Google makes hundreds of algorithm changes each year. It’s easy to chase the new thing, and that can distract us from the fundamentals.
If we know what hasn’t changed in Google’s guidelines, we can stay in line with ongoing best practices.
What has stayed the same?
You should still:
- Follow Google’s Quality Guidelines. They have not changed. The recent changes relate to items that help Google “find, index, and rank your site,” but the same old tricks will still get your website banned.
- Submit your website to Google and submit the sitemap to Search Console. Yes, despite those spam emails you get, you can do this yourself and for free.
- Create good content using the words people searching for you would use. Okay, we’ve heard this mantra before – and in many different ways – yada, yada, yada.
- Support the “If-Modified-Since” HTTP header for modified pages. To be honest, this has been on the list for a while, but I never noticed it. I guess I need to start paying better attention.
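To make the “If-Modified-Since” point concrete, here’s a minimal sketch of how a server-side handler might answer a conditional GET: return 304 Not Modified when the page hasn’t changed since the crawler’s last visit, so Googlebot skips re-downloading unchanged content. The function name and the (status, headers) return shape are my own simplification, not any particular framework’s API.

```python
from email.utils import format_datetime, parsedate_to_datetime

def conditional_response(if_modified_since, last_modified):
    """Answer a conditional GET: 304 if unchanged since the crawler's copy.

    if_modified_since: raw If-Modified-Since request header value, or None.
    last_modified: timezone-aware UTC datetime of the page's last change.
    Returns (status_code, headers), a simplified stand-in for a real handler.
    """
    headers = {"Last-Modified": format_datetime(last_modified, usegmt=True)}
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200, headers  # unparseable header: serve the full page
        # HTTP dates have one-second resolution, so drop microseconds
        if last_modified.replace(microsecond=0) <= since:
            return 304, headers  # Not Modified: crawler keeps its cached copy
    return 200, headers
```

A crawler that got a 304 sends no body, which is exactly the bandwidth saving Google is asking you to enable.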
There were a few, smaller changes in the guidelines. Usually, Google strengthened their language or added some clarity.
Internal Links Are Very Important
If I were to name one underrated SEO tactic, it would be your internal links. Google seems to agree with me.
- Each page needs a text or image link. Unfortunately, many people apparently still build orphaned pages and expect Google to find them. Now Google says you can use something besides a text link: it suggests an image link (explicitly, with an ALT attribute) will work, too. That might not be revolutionary, but what follows might be: the link should be from a page “that is relevant to the target page.” Not all links are created equal.
- Limit the number of links on a page. If links are from your own site, and they’re relevant, why not build more? Well, Google has now given us a limit. They’ve always said to keep it to a “reasonable” number; now they say to limit them to “a few thousand at most.” That might seem like a lot, but have you looked at your HTML lately? I’ve seen websites whose navigation alone far exceeds this per page.
- Broken links. If you want some of those (thousands of) links to count, you should also make sure they’re still active. Previously, Google told us to “check for broken links.” Now it’s a little stronger: “ensure that all links go to live web pages.” Better double-check those links!
- Website Hierarchy. Don’t just throw a few thousand links on your page and call it a day. Now your linking structure should be a fundamental part of your website’s design as part of a “conceptual page hierarchy.”
- Many people use sitemaps (of the HTML variety) to help with internal links. Google is now downplaying that page: they’ve removed the recommendation and suggest we rely on an XML sitemap instead. Depending on how you read the new Guidelines, a human-readable XML sitemap might be enough.
- URL Parameters. Parameters in URLs can make a huge mess, especially on ecommerce websites. Previously, Google warned developers: “If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.” In the latest guidelines, this suggestion has been removed, unfortunately. Google still asks that you allow search engine bots to crawl your site without session IDs, parameters, or other things appended to the URL. They’re not saying not to use analytics or ecommerce platforms that do that (although I would avoid them); they’re telling you not to let Googlebot be sent this extra URL information.
- Content Management Systems. There are numerous CMSes out there with which you can build a website. All of them claim to be “SEO-friendly” (they couldn’t sell any if they didn’t make this claim), but unfortunately many are not. Don’t take my word for it; listen to Google. They’ve been warning you that, if you use a CMS, you should “make sure that the system creates pages and links that search engines can crawl.” What’s new? Google actually calls out two different CMSes: WordPress and Wix. Personally, I think WordPress already does this really well. Unfortunately, the last time I looked at Wix (and, I’ll admit, it’s been a while), it did not.
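The internal-link advice above is easy to audit with a small script. Here’s a rough sketch, using only Python’s standard library, that flags orphaned pages (no inbound link from anywhere else on the site) and pages whose link count blows past the cap. The page set and the 3,000-link threshold are stand-ins for illustration, not numbers from Google.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page (text or image links)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def audit_site(pages, max_links=3000):
    """pages maps URL -> HTML source. Returns (orphans, overlinked)."""
    linked_to = set()
    overlinked = []
    for url, html in pages.items():
        collector = LinkCollector()
        collector.feed(html)
        linked_to.update(collector.links)
        if len(collector.links) > max_links:  # "a few thousand at most"
            overlinked.append(url)
    # An orphan is a page no other page on the site links to
    orphans = [url for url in pages if url not in linked_to]
    return orphans, overlinked
```

Run it over a crawl of your own site and the orphans list tells you which pages you’re expecting Google to find by magic.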
Your Content on Each Page is Still Important
Google, ultimately, is a computer program. Therefore, you should explicitly state what you are about on your pages in text. This is still important.
- Text vs. Images. Google says, “Try to use text instead of images to display important names, content, or links…. If you must use images for textual content, consider using the ALT attribute to include a few words of descriptive text.” One thing they just removed from this statement: “The Google crawler doesn’t recognize text contained in the images.” Could all that CAPTCHA data be paying off for Google? I wouldn’t count on it, and I’d take their first recommendation (but still, it might be worth a test or two).
- Title and ALT tags. Previously, Google recommended we write title tags and ALT tags that are “descriptive and accurate.” Now they add that they should be “specific,” too. I guess they’re tired of people cramming the same keyword into every title tag on a site in the hopes of “ranking” for that phrase, and tired of every ALT tag on a page including your favorite “keyword.” Stop that, already!
- Images, Video and Structured Data. Google makes stronger recommendations when it comes to images, videos, and structured data. Previously they told us to simply “review our recommendations” but now they tell us to “follow” them. Okay, Google, I will.
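If you want to act on the text-vs-images advice programmatically, a quick sketch like this can flag `<img>` tags missing descriptive ALT text. The sample page fragment is hypothetical; point it at your own HTML.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Flag <img> tags that carry no descriptive ALT text."""
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images with empty or absent alt

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            alt = (a.get("alt") or "").strip()
            if not alt:
                self.missing.append(a.get("src") or "(no src)")

# Hypothetical page fragment: one image is labeled, one is not
checker = AltChecker()
checker.feed('<img src="logo.png" alt="Acme Widgets logo"><img src="hero.jpg">')
```

Anything that ends up in `missing` is text-in-an-image that Google may not be able to read.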
Don’t Forget Your Robots.txt File
Not every site needs a robots.txt file, but if you use one, make sure it’s set up correctly.
- Robots.txt file. Previously, there were two directions for webmasters to follow about their robots.txt file: use one, but don’t prevent Googlebot from accessing your site; and do use it to prevent Googlebot from crawling auto-generated pages (such as search results). Now Google suggests you use robots.txt to keep crawlers out of auto-generated pages so they don’t get caught in “infinite spaces.” They also removed the warning against accidentally blocking Googlebot. Perhaps they’re just tired of so many people blocking them and are leaving this one for you to figure out yourself.
- Crawling Scripts. A few months ago, a bunch of webmasters got notifications in their inboxes telling them to stop preventing Google from reaching their scripts (JS and CSS). This was, I think, so Google could tell whether a website is mobile-friendly. The guidelines reflected this, asking us to give Google access to “all assets” like these. Now, Google has modified their guidelines to ask for “all assets that would significantly affect page rendering to be crawled… that affect the understanding of the pages.” So, not everything.
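Before deploying robots.txt changes, you can sanity-check them offline with Python’s urllib.robotparser. This sketch (the rules and paths are made up for illustration) verifies that auto-generated search results are blocked while regular pages and rendering assets stay crawlable:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block auto-generated search results,
# leave everything else (pages, CSS, JS) crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Auto-generated pages are off-limits ("infinite spaces")...
assert not rp.can_fetch("Googlebot", "https://example.com/search?q=widgets")
# ...but regular pages and rendering assets are not blocked.
assert rp.can_fetch("Googlebot", "https://example.com/products/widget")
assert rp.can_fetch("Googlebot", "https://example.com/assets/site.css")
```

A check like this in your deployment pipeline catches the “oops, we blocked Googlebot from the whole site” mistake before Google does.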
Links from Other Sites
Links are still one of the strongest ranking factors. Be sure to keep up with their guidelines when you build those links or you’ll get slapped.
- Awareness of your website. This is the strangest part of the Guidelines: “Make sure all the sites that should know about your pages are aware your site is online.” Is this a strange way of saying “build links” without saying “build links”? I think so. The point I think Google is trying to make is that you should now make sure “any” sites linking to you are linking to active pages. That’s good advice! If you can’t, use a 301 redirect (my advice, not explicitly Google’s).
- The Google organic search team continues to live in a love-hate relationship with ads. While they must realize that ads pay their salaries, they do know that ads can lead to less trustworthy websites. That’s why Google has always asked us to “make reasonable efforts to ensure that advertisements do not affect search engine rankings.” Previously Google gave us an example: “Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.” Apparently the AdSense team got their way a little more in this round of the guidelines. That sentence was removed. However, Google’s vigilance against manipulative link building continues into the new guidelines: “use robots.txt or rel=”nofollow” to prevent advertisement links from being followed by a crawler.”
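Here’s a small sketch that flags advertisement links a crawler could follow, i.e. links to ad hosts that lack rel=“nofollow”. The ad hostname is hypothetical; substitute the networks your pages actually serve.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

AD_HOSTS = {"ads.example-network.com"}  # hypothetical ad-serving hosts

class AdLinkChecker(HTMLParser):
    """Flag advertisement links that lack rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followable_ads = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href") or ""
        rel = (a.get("rel") or "").lower().split()
        if urlparse(href).netloc in AD_HOSTS and "nofollow" not in rel:
            self.followable_ads.append(href)

checker = AdLinkChecker()
checker.feed(
    '<a href="https://ads.example-network.com/buy" rel="nofollow">Ad</a>'
    '<a href="https://ads.example-network.com/click">Another ad</a>'
)
```

Anything in `followable_ads` is an ad link that should either get rel=“nofollow” or be disallowed in robots.txt, per the guideline quoted above.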
Some Technical Guidelines
- Browser Compatibility. To be honest, I’ve never really worried about how a website performed in different browsers. It seems to me that they’re all converging on a similar standard, and it’s rare for a website not to work the same across most browsers; this is especially true now that IE is dead! Now that the conventions are converging, apparently Google is getting stricter about this. Previously, they just told us to “test your site” for cross-browser compatibility. Now they are telling us to “ensure” the same experience across browsers. Don’t forget mobile devices and tablets, too (keep reading)!
- Page Loading Times. We’ve all known that page speed is important not only to Google but also to people, and Google continues to promote this in their new guidelines. They’ve removed some of the phony-baloney idealism about making the web better; now they just tell you to keep things running quickly. They even recommend a couple of tools to help you with this, and they removed another from their list of recommended solutions: YSlow is no longer explicitly mentioned. That’s what Yahoo! gets for stealing some of their market share!
So, most things that used to help Google “find, index and rank” your web pages are still in effect. Sure, Google changes things often, but there are several basic things that will always be the same.