SEO and the Modern Web

The purpose of a search engine has always been to connect people with the information they seek, and search engines keep getting better at it. In the early days, Web crawlers were easily fooled by authors who engaged in keyword stuffing, link farming, and a myriad of other tricks to climb the search result ladder. As search algorithms evolved, those old tricks stopped working. Unfortunately, many people still think there is a magic lever a Web developer can pull to push their website to the top of the result list. Well, I’m going to tell you that it’s simply not true – or at least it hasn’t been discovered yet. Your search ranking is now more in the hands of your marketing team and content authors than the Web developer. So, if you’re not a great writer, it may be time to learn how to become one – or time to hire someone who is.

Quality Keyword- (and Synonym-) Rich Content

Writing high-quality content means writing for people, not machines. Continuously repeating a word or catering to what you think Google wants to hear instead of to your actual readers is likely to have the opposite effect: a lower ranking or even penalization. The first step to good SEO is to put your readers first.

That doesn’t mean you should avoid using keywords, just that you should use them strategically. Synonyms can be helpful in that respect. It’s also worth noting that you shouldn’t target words that are already over-saturated in the major search engines. For example, I’m not expecting to appear on the first page of results for words and phrases like “SEO” or “search results.” There are too many high-ranking websites already targeting those words. However, I may appear in a search for something like “modern optimization for higher search ranking.” You can check the popularity of a search using Google’s keyword tool. Do some research on the keywords you would like to use to see what the competition/demand ratio is.

Links and Referrals from High Ranking Websites

There’s no question anymore that the content author plays the most important role in improving search results. However, it doesn’t matter how good your content is if no one else knows about it. Search rankings are basically a popularity contest, and if other people aren’t talking about you, then don’t expect Google to trumpet your site either. That’s where marketing comes in. You need to promote your website. The easiest way to do this is to leverage your social media networks, such as Twitter and Facebook. Your public tweets and posts get picked up by search engines and count as votes for your website. But more importantly, they may be picked up by other website owners who decide to help spread the word by linking to you. Links from websites that are well regarded in their respective fields are essential to being noticed by the major search engines.

Don’t waste time posting links to your website in blog comments or by purchasing a spot in a list of “featured” sites. Google is smart enough to recognize these areas of a Web page, and you aren’t likely to see a good return on the time and/or money spent. Focus on avenues of marketing that will truly draw attention from experts in your field, and the search engines will follow.

Useful Website Optimizations to Improve SEO

Notice that, so far, I haven’t talked about anything that requires the assistance of an SEO “expert” or Web developer – except for having a website that allows you to publish content, of course. As I said earlier, a developer is no longer the leading influence on your search ranking. It’s true that we’ve already covered the two major influences on optimization: good content and lots of inbound links from respected websites. However, that doesn’t mean there aren’t changes you can make to improve your SEO.

Make Sure Your Site Loads Quickly

A website that takes more than a few seconds to load is going to seriously jeopardize your ranking. A user isn’t likely to stick around and wait for your website when it’s just as fast to hit the “Back” button and click the next result in the list. Google knows this, so why would it include a slow Web page in the top results? Here are a few ways to improve your page load speed.

Use GZIP and Other Compression Techniques When Serving Files

Make sure your server GZIP-compresses your files before sending them. GZIP helps files download quickly by replacing repeated strings of text with shorter references, a process the Web browser reverses on its end. To see if your files are being compressed, use your browser’s developer tools to check the server response headers, and look for the following: Content-Encoding: gzip
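
If you’re on IIS, as an ASP.NET site typically would be, here’s a minimal sketch of how compression can be switched on in Web.config (assuming IIS 7 or later; other servers have their own equivalents):

<system.webServer>
	<!-- Compress static files (CSS, JS) and dynamic responses (HTML) -->
	<urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>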

In addition to GZIP, you should strip whitespace characters from your HTML, CSS, and JS files (otherwise known as minification). For large files, this can make a big difference.
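
As a quick illustration, here’s a hypothetical before-and-after for a single CSS rule (the class name is made up). A minifier strips the whitespace and may shorten values too:

/* Before minification */
.article-title {
	color: #336699;
	margin-bottom: 10px;
}

/* After minification */
.article-title{color:#369;margin-bottom:10px}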

Image compression is important as well. If you aren’t sure when to use GIF, PNG, or JPG, then you need to learn. Also, don’t send images larger than they need to be. This can be especially challenging when supporting small mobile devices and high pixel-density displays.

Reduce the Number of HTTP Requests

Browsers only download a small number of files in parallel (typically around six per host). So if your website has seven CSS files and twelve jQuery plugins, you may be holding up the display of your website. Combine these files into as few as possible, or load your JavaScript asynchronously (lazy loading).
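
For example (the file names here are placeholders), one combined stylesheet and one deferred script might look like this:

<!-- One combined, minified stylesheet instead of seven separate files -->
<link rel="stylesheet" href="/css/site.min.css">

<!-- The async attribute lets the page render while the script downloads -->
<script src="/js/plugins.min.js" async></script>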

If you have a large number of background images, consider combining them into an image sprite. There are many tutorials that explain how this can be done. If you use Compass, then you can automate the sprite creation process.
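
The basic idea looks like this in CSS, assuming a hypothetical sprite sheet of 16x16 icons laid out in a row:

/* Every icon shares one background image, so one HTTP request covers them all */
.icon {
	background: url(/images/sprite.png) no-repeat;
	width: 16px;
	height: 16px;
}

/* Shift the background to reveal the icon you want */
.icon-home { background-position: 0 0; }
.icon-search { background-position: -16px 0; }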

Optimize Server-side Code

Slow loading pages aren’t always due to network-related activity and file sizes. If your server-side code is doing a lot of unnecessary work, then it could take a while before the files even get sent off. The test-driven development (TDD) process is a great way to catch code that isn’t up to snuff.

Create Clean, Semantic HTML

If you use appropriate HTML tags to define your content, it will be easier for search engines to categorize your site and find relevant information. New HTML5 elements, such as main, article, and aside, are much more descriptive than generic div tags. Clean, well organized code means that Web spiders won’t have a hard time finding the “meat” of your content, and your file size will probably benefit as well.
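
For instance, a basic article page might be structured like this (the headings and text are placeholders):

<body>
	<header>Site title and navigation</header>
	<main>
		<article>
			<h1>Modern Optimization for Higher Search Ranking</h1>
			<p>The “meat” of your content goes here.</p>
		</article>
		<aside>Related links</aside>
	</main>
	<footer>Copyright and contact info</footer>
</body>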

Create a Google+ Account and Make Use of Rich Snippets

Rich snippets are a way of categorizing your content and providing short bits of useful info along with your search result description. For example, if your content is a review of a local restaurant, you can use a rich snippet to display a star rating with your search result. Google currently supports rich snippets for several content types, including reviews, recipes, events, products, and people.
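
To give you an idea, here’s a minimal sketch of the restaurant review scenario using schema.org microdata (the restaurant, rating, and author are made up):

<div itemscope itemtype="http://schema.org/Review">
	<span itemprop="itemReviewed" itemscope itemtype="http://schema.org/Restaurant">
		<span itemprop="name">Joe's Diner</span>
	</span>
	<span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
		Rated <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span> stars
	</span>
	by <span itemprop="author">Jane Smith</span>
</div>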

While not a standard rich snippet, Google offers support for an author snippet that will cause the content author’s picture and the number of people who have the author in their Google+ circles to appear next to the search result. This draws attention and can increase your click-through rate. It also requires minimal effort to set up.

Canonical URLs and 301 Redirects Ensure the Right URL is Indexed

Most websites use SEO-friendly URLs instead of complex query strings with IDs and other variables. But their owners often forget that the page will be accessible through both the short, nice URL and the long, ugly one. If a search engine finds both URLs, it looks like you have duplicate content on your site. Most of the time, only one URL will appear in the result listing, but in the worst case, you may suffer a penalty if Google thinks you’re trying to game the system.

Canonical URLs are supported by all the big search players now, and they’re basically just a <link> tag that tells crawlers where to find the primary source of the information (i.e. the friendly URL). Be careful, though; if you don’t know what you’re doing, you could destroy your search ranking. Imagine a case where your CMS accidentally adds canonical link tags pointing to the wrong URL. You might not notice it without checking the HTML source code, but it would wreak havoc on your page rank. A sitemap.xml file is also important. Google, and probably others, use this file when determining the preferred URL for a page.
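
The tag itself is simple. Placed in the <head> of every version of the page, it might look like this (the URL is a placeholder):

<link rel="canonical" href="http://example.com/articles/modern-seo">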

If canonical URLs aren’t a good fit, be sure to use 301 redirects to keep users and spiders off the wrong URL. Most public websites should have a 301 redirect for the “www” subdomain, either to it or from it – it’s up to you.
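
As an example, here’s a sketch of a www-to-bare-domain redirect for IIS with the URL Rewrite module installed (swap in your own domain; Apache and other servers have equivalent mechanisms):

<system.webServer>
	<rewrite>
		<rules>
			<rule name="Redirect www to bare domain" stopProcessing="true">
				<match url="(.*)" />
				<conditions>
					<add input="{HTTP_HOST}" pattern="^www\.example\.com$" />
				</conditions>
				<action type="Redirect" url="http://example.com/{R:1}" redirectType="Permanent" />
			</rule>
		</rules>
	</rewrite>
</system.webServer>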

While we’re talking about redirects and canonicalization, it’s worth mentioning that a high number of broken links isn’t good for SEO either. Regularly check your pages for links that no longer work.

Shoot for Long Page Visits and Low Bounce Rates

Traffic tools, such as Google Analytics, provide a wealth of information about your site visitors. If the average time spent on your website is only a few seconds and you have a high bounce rate, it’s a sign that people aren’t finding what they want from your site. Take a closer look at your content and see how you can revise it to target the people that would find it useful.

Only One Meta Tag Still Applies to SEO

Okay, that heading may be exaggerating a bit, but only a little. The meta description tag is probably the most important of all the meta tags, and it’s still not as important as the other items I’ve listed. If your SEO expert is telling you that you need to optimize your meta tags, he or she is probably behind the times.

However, the meta description tag will sometimes be used as the actual search result summary for your page – but only if Google couldn’t find something better in your content itself. The reason meta tags are largely ignored by search engines is that they were abused so often by developers trying to get a leg up in the results. So now, less attention is paid to content that isn’t actually visible on the page. Still, make sure to include a description – it may give your page one more opportunity to reel in a reader.
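
For reference, the tag lives in the <head> of your page, and the content attribute holds the summary (the text below is just an example):

<meta name="description" content="A look at what actually influences search ranking today: quality content, inbound links, and a handful of worthwhile site optimizations.">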


In summary, in order to perform well in search listings, you need great content and lots of inbound links from great websites. These are the two most important factors in SEO, but don’t forget that there is still room for a Web developer to make optimizations – just don’t expect him or her to bring your site to the top search result in Google with a few code changes.

Using TranslateX with EPiServer 6 R2

In a recent project, I had to install the TranslateX module into an EPiServer 6 R2 site. TranslateX is an open source EPiServer module that allows you to export pages in XLIFF format for translation services. The problem is that the original module was created for EPiServer 4 and has since been updated to work with version 6.0, but I was unable to find any resources or help for getting it installed in 6 R2 or 7. After some trial and error, I came up with these steps.

  1. Download EPiServer CMS 6.0 from EPiServer World and install it. Don’t worry — even though it’s an older version, it shouldn’t cause any problems with your R2 installation.
  2. Download the install package for the 6.0 version of the module.
  3. Run EPiServer Deployment Center (located at C:\Program Files (x86)\EPiServer\Shared\Install\EPiServerInstall.exe by default).
  4. On the All Actions tab, run Installed Products > EPiServer CMS > Version 6.0.x > Modules > EPiServer.Research.TranslateX.Installer.6.
  5. The default options for the first step of the installer should be fine. On step 2, check the “Show All Sites” check box, and then select the site where you will install TranslateX. (Note: You will receive a compatibility warning when you check the “Show All Sites” check box, but it doesn’t appear to be a problem.)
  6. Finish the installation.

Now, you should have it successfully installed. However, there’s more to do before it’ll work. First, you will need to create a folder for TranslateX to save temporary files in, and EPiServer will need write access to that folder. You can name the folder “translationtemp” or something similar.

Open your Web.config file and look in the appSettings section for a key named “xliffworkpath”. You’ll need to update this value with the location of your “translationtemp” directory.
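
The entry should end up looking something like this (the path below is an example; point it at the folder you created):

<appSettings>
	<add key="xliffworkpath" value="C:\inetpub\mysite\translationtemp" />
</appSettings>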

At this point, you can create translation projects for your pages by navigating to a page in the CMS and clicking the “Translation” tab. But when you try to run the project (Admin Mode > Admin tab > Scheduled Jobs > Translation scheduler service > Start Manually), you’ll probably notice an error on the “History” tab, and you will never receive an e-mail with the translation. That’s because you need to tell EPiServer which languages are available for translation through an XML file.

In the root folder of your EPiServer installation, there should be a “lang” folder. Create an XML file called “translationlangs.xml” and place it in that folder. Here’s a sample of the content to include in the XML file:

<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<languages>
	<language name="English" id="en">
		<translationlanguages>
			<en>en</en>
			<en-GB>en-gb</en-GB>
			<es>es</es>
		</translationlanguages>
	</language>

	<language name="British" id="en-GB">
		<translationlanguages>
			<en>en</en>
			<en-GB>en-gb</en-GB>
			<es>es</es>
		</translationlanguages>
	</language>

	<language name="Spanish" id="es">
		<translationlanguages>
			<en>en</en>
			<en-GB>en-gb</en-GB>
			<es>es</es>
		</translationlanguages>
	</language>
</languages>

Make sure to update the XML to fit your own language needs. As long as your mail server is working, you shouldn’t need anything else. If the “History” tab on the translation scheduler service says “OK” after starting the service, you’re good to go.

I haven’t tried installing TranslateX in EPiServer 7, but let me know if the steps above were any help.

Simple jQuery Autosubmit Plugin

One issue that I have with ASP.NET is how often it injects inline styles and scripts into your pages. One example is the “AutoPostBack” attribute that you can add to your DropDownList and RadioButtonList server controls. When this attribute is set, ASP.NET adds an inline “onchange” event handler to your markup along with some other scripts. Ideally, this functionality should be handled in an external JS file, and it’s really not that hard to do.

Here’s a quick jQuery plugin that will accomplish that without having to rely on .NET’s “AutoPostBack” feature.

/// <summary>
/// The parent form of the provided element(s) will be submitted when its/their value changes.
/// </summary>
/// <param name="options.ignore">An ignore option is available (array) to prevent the form from submitting when certain values are chosen. By default, options with an empty string value are ignored.</param>
/// <param name="options.trigger">If trigger is omitted, the parent form will be submitted. Otherwise, a click event will be triggered on the provided jQuery object.</param>
(function(factory) {
	if (typeof define === 'function' && define.amd) {
		// Register as an anonymous module
		define(['jquery'], factory);
	} else {
		// Browser globals
		factory(jQuery);
	}
}(function ($) {
	$.fn.autosubmit = function (options) {
		var settings = $.extend({
			'ignore': ['']
		}, options);

		this.change(function () {
			var $this = $(this);

			if ($.inArray($this.val(), settings.ignore) === -1) {
				if (typeof settings.trigger !== 'undefined') {
					settings.trigger.click();
				} else {
					$this.closest('form').submit();
				}
			}
		});

		return this;
	};
}));

Using the plugin is simple. Create your DropDownList as usual, but give it a CssClass attribute so you can reference it easily:

<asp:DropDownList ID="myDropDownList" CssClass="auto-drop-down" runat="server">…</asp:DropDownList>

Include jQuery and the plugin file in your page, then write some JS similar to the following:

$('.auto-drop-down').autosubmit();

The plugin also allows you to specify certain values that will not autosubmit when chosen. For example, you may have a drop-down with a “Select an option” choice, in which case you wouldn’t want to submit the form unless a different choice was made. Here’s how to ignore certain values:

$('.auto-drop-down').autosubmit({'ignore': ['', 'n/a', 'empty']});
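
You can also use the trigger option described in the plugin’s doc comments. Instead of submitting the form directly, the plugin will click an existing button for you, which is handy when ASP.NET has wired postback logic to that button:

// Clicks the button (and fires its handlers) instead of submitting the form
$('.auto-drop-down').autosubmit({'trigger': $('#my-button')});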

One more thing to point out — be sure to include a submit button even if the autosubmit feature makes it unnecessary. If someone isn’t using JavaScript, they should still be able to use the drop-down on your page. Another line of JS for hiding the submit button is trivial:

$('#my-button').hide();