
Optimizing new Blogger title tags for SEO

By default, the title tags that Blogger uses are less than ideal. Search engines put a lot of value on your title tags, so it's worth taking some time to make sure that they're set up correctly.

When I initially set up my blog, I set the title to "Young Technologies Tech Blog". Look what happened in the search results:

[Screenshot: Google search results for the blog]

YUCK! There are two major problems here. The first is that the titles are not at all useful to a human. How would you expect anyone to click on titles like this?

The second problem is that my title tags don't tell Google anything interesting about my site. You want the search engines to figure out the keywords in each of your pages, and having those keywords in the title reinforces that.
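
To make that concrete, here's roughly the before and after I'm aiming for on a post page, using this post and the shorter blog name I settle on below as the example:

<!-- Before: every page shares the same blog-wide title -->
<title>Young Technologies Tech Blog</title>

<!-- After: the post's own keywords come first, with a short blog name as the suffix -->
<title>Optimizing new Blogger title tags for SEO - YTechie.com</title>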

In "classic" blogger, you would simply use

<title>
<MainOrArchivePage><$BlogTitle$></MainOrArchivePage>
<ItemPage><Blogger><$BlogItemTitle$></Blogger> - <$BlogTitle$></ItemPage>
</title>

instead of:

<title><$BlogPageTitle$></title>

This would use the title of the post item as the title of the page. The problem is that the "new" Blogger has a completely different template system. Be sure to back up your template before editing it! Here is what I did, step-by-step:

  • Shorten your actual blog title. I was unable to figure out how to remove the blog title from each post page without losing the post title; unfortunately, Blogger doesn't have a token for the post item title. For my blog, I changed the title to "YTechie.com", which I think is a reasonable prefix for my post page titles.
  • Customize the title tag in your template. Do this by editing the HTML, expanding the widget templates, and putting the following code in place of the existing title tag. This will allow you to have a custom title just for the front page.
<b:if cond='data:blog.pageType == "index"'>
 <title>Front Page Title (change this)</title>
<b:else/>
 <title><data:blog.pageTitle/></title>
</b:if>
  • In the code above, put in the title you want for the front page. Try to keep it around or under 66 characters. For additional guidelines, consult this guide.
  • Since you changed the title of your blog to something shorter, that shorter title will show up in the header as well. To customize the header text, replace this:
<b:if cond='data:blog.url == data:blog.homepageUrl'>
   <data:title/>
<b:else/>
   <a expr:href='data:blog.homepageUrl'><data:title/></a>
</b:if>

With this:

<b:if cond='data:blog.url == data:blog.homepageUrl'>
   This is my header title (change this)
<b:else/>
   <a expr:href='data:blog.homepageUrl'>This is my header title (change this)</a>
</b:if>


Avoiding duplicate content with your site or blog

One of the most important rules in SEO (search engine optimization) is avoiding duplicate content. Google has some information on their page about how they handle duplicate content. Unfortunately, the Googlebot is rarely smart enough to know which copy of the content is the original. Google wants to discourage people who copy and/or republish someone else's work simply to get content for their site.

You also want Google to find pages on your site that have substance, not just copies of content from your other pages.


So how do you avoid it on your site? The first step is to identify potential pages that have duplicate content. It's probably happening without you even being aware of it.

Type this into Google: site:www.yoursite.com

I'm using Blogger, and here are some pages that are indexed by default but should not be:

  • http://www.ytechie.com/2008/04/aspnet-linkbutton-and-seo.html?widgetType=BlogArchive&widgetId=BlogArchive1&action=toggle&dir=close&toggle=YEARLY-1199167200000&toggleopen=MONTHLY-1207026000000
  • http://www.ytechie.com/2008_03_01_archive.html

Now that we've identified the offending pages, we can create or modify the robots.txt file at the root of our site.

Here is what I could add to my robots.txt to block those pages:

User-agent: *
Disallow: /*?                # any URL containing a query string
Disallow: /*_archive.html$   # the monthly archive pages

Once you've updated your robots.txt file, you can use Google Webmaster Tools to test it. For more information on how to edit your robots.txt file, including syntax, consult Google.
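
You can also just pull down the file your host is currently serving to see what's in it (assuming you have curl handy):

curl http://www.ytechie.com/robots.txt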

There is one big problem: if you're using a hosted service like Blogger (as this blog does), you can't edit your robots.txt file. There has been talk of adding support, but for now we have to work with what's available.

The best I've been able to come up with is adding this into the head (look for <head>) of my template code:

<b:if cond='data:blog.pageType == "archive"'>
  <meta name="robots" content="noindex, nofollow" />
</b:if>

This adds a noindex and nofollow meta tag to the generated archive pages. I have not yet figured out how to remove pages that contain parameters (?param=value). If anyone has a way to do it, please let me know! I've actually been considering removing the archive widget to solve it.
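
A quick way to spot-check that the tag is actually being emitted on an archive page (again assuming curl, using one of the archive URLs found earlier):

curl -s http://www.ytechie.com/2008_03_01_archive.html | grep -i noindex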


RE: Switching from Linux to Windows 2008

I've gotten so much feedback from my post "I switched from Linux to Windows Server 2008" that I think I need to clarify a few things. The responses have ranged from agreement to personal attacks. I'm impressed by the emotion of the people who really believe in their operating system. Personally, I've always tried to use whichever tool fits the situation.

Most importantly, I'd like to explain why I wrote the post. I was simply sharing my personal experience, which may or may not be your experience. I know that Linux is very popular, and my opinion is that it is a superior operating system in many ways.

Now for a little history of my experience, which may help explain where I'm coming from.


Years ago, I was running Windows 2003 as a home server. I was frustrated by the poor performance, the high resource usage at idle, and how tricky it was to configure. I have years of Windows experience, and I could never quite get it set up the way I wanted. I also wanted to avoid the cryptic Windows licensing and the policy of locking you into the products of Microsoft's choosing.

So I built a brand new computer and installed Ubuntu Server, 64-bit edition. I spent weeks getting it configured (not continuously, of course). I kept notes, and I was pleased that I could use a single command to install pretty much everything I needed:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get -y install ssh subversion dovecot-imapd samba xinetd \
    build-essential getmail4 p7zip-full ia32-libs mdadm bind

How cool is that!

I was very happy, except when things would go wrong. For example, 4GB of memory wouldn't work with Linux, but anything less would. It turns out that I had to blacklist the "intel_agp" module. I really wish I could get back the hours of my time that were wasted. Am I any better for the experience? No. Can this happen in Windows? It certainly can.
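
For anyone hitting the same problem, the blacklist itself is just a couple of commands (this was the path on my Ubuntu install; newer releases expect a .conf extension under /etc/modprobe.d):

# Tell modprobe never to load intel_agp at boot
echo "blacklist intel_agp" | sudo tee -a /etc/modprobe.d/blacklist
# Rebuild the initramfs so the change also applies to modules loaded early in boot
sudo update-initramfs -u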

Here are the additional problems I had, each of which wasted more hours of my time:

  • The software RAID array would stop working 50% of the time when rebooting. It's not fun having your heart sink when it says that there is a problem with the RAID array that all of your data is on (yes, I have off-site backups, but it's still a hassle).
  • The VMware server console would not work when I reinstalled the host OS. VMware itself worked fine, and I used the "vmrun" command over Putty to manage my virtual machines. Believe me, that is not fun.
  • Samba was ridiculously slow. Try searching for samba optimizations in Google. Why isn't the performance better out of the box?
  • Random network errors would appear on the server console. Sometimes the network would simply fail, and sometimes it took a reboot to get it working again. Simply running "sudo /etc/init.d/networking restart" didn't work.
  • Dual monitor support. Don't even ask me how many times I've modified my xorg configuration file.

Could I have taken the dozens of hours and fixed all of these issues? Maybe. I did my due diligence and searched Google and the Ubuntu forums for help. I even posted my problems without getting answers. I'm certainly not against learning. I'm a software engineer by trade, so I would rather be spending time learning Adobe Flex, Silverlight, MVC frameworks, and new coding methodologies. Is it so bad that I want to choose what I want to learn?

I just want a solution to my problem that works with the minimum amount of hassle, so I can focus on what I enjoy.

That is my _opinion_, and you can't say it's wrong!

Recently I read quite a few articles about .NET developers switching to Windows 2008 as a workstation operating system. I took the leap and did the same. My experience has been very positive, and my opinion is that it's a great operating system. I intend to write a post specifically about that in the future.

Because of the frustrations I was having with my server, I decided to give Windows 2008 a try there as well. In total, I've spent about two hours getting it set up, from start to finish. Some of the new setup features deserve an entire blog post (stay tuned!). The only things I haven't set up yet are Subversion and my off-site rsync backups. I'll be running the Subversion virtual appliance for my source code (which, coincidentally, is my own project).

One person said that I'm just trying to get attention. Well, it's my blog, and I admit that part of my motivation is to get readers.

Another person said that the post was an advertisement (at least I think that's what he meant). Well, to prove I'm not being paid by Microsoft, I'll admit that they have a ton of software that just plain sucks. I'm looking at you, Vista.


I'm Jason Young, a software engineer. This blog contains my opinions, which my employer, Microsoft, may not share.
