RIP - Lessons Learned

In the past, my colleagues and I have had "big ideas" that we tried to turn into reality. Unfortunately, we never found that one idea that took 5 minutes to create and made millions overnight. I think that's everyone's goal, but it's rarely the reality.

I've decided to shut down two big sites that I thought had a great shot at success, but that never quite reached critical mass. As a tribute to these sites, I'm going to dedicate a post to each, covering their history and the lessons that were learned. This post covers the first site, and the next post will talk about Logo

A few years ago, people started realizing the power of "backlinks": links on other sites that point to your page. Sites started taking advantage of this by doing link exchanges. Basically, you would offer to link to another website in exchange for a link from their site back to yours. Of course, this was a tedious job, and people were looking for a quick way out.

Along came sites like PowerLinks (which was later renamed LinksMaster). Basically, they link every site in their system to your site, and in exchange, you link back to every other site. The problem then, and even more so today, is that Google is known to be able to detect reciprocal links. To make matters worse, Google has been known to penalize for this behavior. The goal of Google is to determine relevance by examining quality, organic links. PowerLinks would instantly give you 10,000 links, which is not exactly natural. Currently, their site seems to have degraded into a pyramid scheme, and is still plagued with the problem of being a really bad idea.

A few colleagues and I knew there had to be a better way. The result was our own site. It worked like PowerLinks, except that it avoided reciprocal links: any site that you linked to would not link back to you. The other major feature was that instead of adding thousands of links instantly, the links would grow naturally over time. Basically, we were mimicking what would happen if you went out and got the links yourself, but in a completely automated way.

Our concept was sound, and I personally saw the backlinks help boost pages in the search engines. Even with over a thousand users at its peak, though, the site never reached critical mass. The growth rate was there, but only with constant, expensive advertising. The system relied on a large number of users joining to keep making relevant links.

An interesting problem we ran into was that for a lot of users who visited the site, it would appear blank in their browser. Norton Internet Security didn't like our site, because Symantec had decided to block any site whose domain ended in "LinkExchange". LinkExchange was a website from the early web days that allowed you to put a banner ad on your website and receive a reciprocal banner ad on another site pointing back to yours. Norton was apparently blocking it because they considered it spam. We contacted Symantec, and they refused to change the behavior of their software.

Lessons Learned

  • Critical mass is vital to the success of your site. It may take a boatload of money for advertising, an existing army of followers, or a concept so profound or new that it spreads virally. Once we quit advertising, the site began to die a slow death.
  • The better product doesn't always win. Once people understood how our system worked, they were sold. The problem is that in the 10 seconds we had to sell the service, we failed. There are many other factors more important than having the best product. Part of the problem is that it isn't always clear which product is better. To sell a service, I recommend really examining how you'll be marketing it, especially in our current era of social networking.
  • Simple is better. We live in the real world, where we just want things done without any work. Many users simply didn't understand the advantages of the site, and others had a hard time using it. I learned that if something is going to be successful, it needs to be so easy that I don't have to think about it.
  • How smart are your users? For many services, users sign up because the service solves a problem they don't understand. Our users typically didn't understand SEO, but knew it was important. They didn't understand the benefits of our system, and wanted a quick fix (even though one didn't exist).
  • Users want instant gratification. PowerLinks had the advantages of getting thousands of links instantly. Once the user saw that happen, they would sleep better at night. With our service, users didn't feel satisfied when there were days that they didn't get any backlinks (remember, the site was designed to mimic human behavior). If you can charge money, and instantly give users something of value, they will be happy.
  • Choose your enemies wisely. In effect, we were battling against Google. We decided to outsmart their system. The result was that we had to avoid doing what we thought they would detect and penalize. Think about the future, and understand if you have a realistic shot of winning.
  • Software is hard. It seems like I have to relearn this on every project. You're not going to write this stuff overnight. Doing it well is going to take time, especially if you also have a full-time job. You need to determine for yourself whether it's even worth taking away from your personal time, and potentially family time, for this venture.

Conclusion

Even though the site never made millions, I don't regret the time spent on it. It certainly made me a better developer and architect, and the lessons learned will always be with me. I could never have learned as much without having a vested interest in, and a passion for, its success.

Secure XAMPP by only allowing local access

This site and a couple others are served up on a dedicated server. To make it easy to set up Wordpress, I'm using XAMPP. In this post, I'll give a quick overview of XAMPP, and then also show you how to secure it so that administrative utilities are only available locally.

XAMPP Super-Quick Overview

XAMPP is basically a quick way of setting up MySQL, PHP, Perl, and Apache. You can download it, extract it, and run it from any location. If you're not experienced in the Apache/PHP world, this is the easiest way to get something working ASAP. Under your XAMPP directory, here are the key folders you'll probably need to worry about:

  • apache - Contains the installation of Apache.
  • htdocs - Contains the folder that is served up by Apache. If you want to install a web application such as Wordpress, you probably want it in here.
  • mysql - Contains the installation of MySQL.

Now, if you simply want to install Wordpress on XAMPP, I'm not going to write yet another tutorial. There are plenty out there, most with step-by-step instructions and screenshots; just Google for them.

Securing XAMPP

Once I configured XAMPP, I stupidly assumed that the utility applications like phpMyAdmin would not be publicly available. I was very wrong, and luckily I was warned before anyone decided to do something bad.

Most of the instructions I found through Google for securing the utility paths seemed kind of weak to me. They basically work by securing those paths with a password. I'm a little paranoid, so I don't want those paths remotely accessible at all.

The first thing I did was lock down Apache security so that it's very restrictive by default (apache/httpd.conf):

<Directory />
    Options Indexes FollowSymLinks Includes ExecCGI
    AllowOverride All
    Order deny,allow
    Deny from all
</Directory>

Make sure that you don't have any other directory directives in your configuration file that may override this.

Next, in the .htaccess file for each Wordpress installation, I added this line: "Allow from all". This basically tells Apache that this folder is safe to serve up to everyone.
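For reference, the .htaccess in a blog folder might contain just this single line (the folder path in the comment is only an example; any existing WordPress rewrite rules stay as they are):

```apache
# htdocs/myblog/.htaccess - path is an example
# Re-open this folder to the public after the global "Deny from all"
Allow from all
```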

Now, the problem is that XAMPP ships with a configuration file that overrides the utility paths and allows access to anyone. To fix this, perform a search and replace in the "httpd-xampp.conf" file (in apache/conf/extra) to change "Allow from all" to "Allow from localhost". Now, all of the XAMPP directories will only be served locally.
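As an illustration, a locked-down section of httpd-xampp.conf might end up looking like this (the exact directory paths vary between XAMPP versions, so treat this as a sketch rather than a copy-paste target):

```apache
# apache/conf/extra/httpd-xampp.conf (illustrative excerpt)
<Directory "/xampp/phpmyadmin">
    Order deny,allow
    Deny from all
    # Was "Allow from all" - now only local requests are served
    Allow from localhost
</Directory>
```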


My background is certainly not Apache/PHP, and I'm still learning. If I made any mistakes in my configuration, please leave a comment or send me an email.

ASP.NET MVC Pros and Cons

In our current iteration of improving our software development strategy, ASP.NET WebForms simply doesn't fit the new demands of being unit testable and flexible. It comes as no surprise that ASP.NET MVC has been getting all the headlines lately. Like many others, I've gotten the itch to try it out. In this post, I'll talk about what I believe are the main pros and cons of the new style of building web applications.


PRO - No ViewState or "surprise crap"

In traditional ASP.NET WebForms, the luxury of pretending to behave like a Windows Form comes at a price. The ViewState is a reliable way of storing all of the state information for the form. Unfortunately, due to the limitations of the web, this data needs to be stored as a giant string inside a hidden form field. This ends up adding a substantial number of bytes to the page, making it slower and requiring extra bandwidth. Of course, the ViewState is controllable, much like the dinosaurs in Jurassic Park.

Not only is the ViewState gone, but "mystery" generated HTML is also gone. You have strict control over the HTML. This gives you great power, but with great power comes great responsibility. Use it wisely, and you will have elegant XHTML output with no surprises. You need to really know your HTML, which in today's web world is a prerequisite anyway.

PRO - Faster server-side

It's hard to get any real performance data about MVC, but it's been suggested that it's potentially 8000 times faster. Supposedly it's due to less processing since it simply processes a "template" instead of having to build a complicated control tree. Even if it's twice as fast, or even marginally faster, that would be significant for popular sites, or give at least a slight boost to smaller sites.

One thing that I found much easier to do with MVC was to have multiple versions of a page that display the same data in slightly different formats. For example, on my RSS package tracking website, you can look at your tracking information in a full-featured desktop browser, a mobile browser, or an RSS reader. The data being displayed is always the same, but the rendered output is different. If I later wanted to make an iPhone-specific version, for example, I would simply create a new view and reuse an existing controller action.
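As a sketch of that pattern (the names here are hypothetical, not from the actual site), a single controller action can serve several views:

```csharp
// Hypothetical sketch: one action, multiple renderings of the same data.
public class TrackingController : Controller
{
    private readonly ITrackingRepository _repository;

    public TrackingController(ITrackingRepository repository)
    {
        _repository = repository;
    }

    public ActionResult Details(string trackingNumber, string format)
    {
        TrackingInfo info = _repository.GetTracking(trackingNumber);

        // Same model every time; only the view template changes.
        switch (format)
        {
            case "rss":    return View("DetailsRss", info);
            case "mobile": return View("DetailsMobile", info);
            default:       return View("Details", info);
        }
    }
}
```

Adding an iPhone version would then just mean adding one more view and one more case.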

PRO - Unit testable

One of the biggest challenges with WebForms was that testing was difficult. Not only was the actual output hard to test, but the code-behind tended to accumulate important code that never got unit tested. With both MVC and WebForms, it's best to keep your logic away from the page, but it's not always easy or practical. MVC makes it simple to unit test the logic that is specific to a page: you simply test the actions in your controllers, which are regular, easy-to-test classes.
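For example, assuming a hypothetical TrackingController whose Details action returns a named view, a unit test (NUnit syntax here) can call the action directly, with no web server involved:

```csharp
// Hypothetical sketch: controller actions are plain methods, so they
// can be invoked and inspected like any other class under test.
[Test]
public void Details_ReturnsDefaultView_WithTrackingData()
{
    var controller = new TrackingController(new FakeTrackingRepository());

    var result = (ViewResult)controller.Details("12345", null);

    Assert.AreEqual("Details", result.ViewName);
    Assert.IsInstanceOf<TrackingInfo>(result.ViewData.Model);
}
```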

CON - Difficult to convert an existing site

MVC is not only a framework in this case, but a style. It is possible to convert specific pages as needed, but the cost is high. The problem is that MVC is radically different, so the benefit of converting probably isn't worth it for most of your existing pages.

If you decide to convert your site to MVC, you may also run into issues trying to maintain existing page addresses. The specific issue I've run into is that routes cannot have a trailing slash. If you want to maintain existing URLs that have trailing slashes, there is no way to have the built-in routing generate URLs with a trailing slash. You may end up losing one of the big advantages that MVC has to offer.

CON - Not best for SEO out of the box

I've mentioned some of the SEO issues before, and all but the trailing slash issue have a reasonable workaround. The routing engine likes to allow multiple addresses to render the same page, instead of enforcing a single address for each page. Luckily, as Scott Hanselman mentions, you can use a URL rewrite engine to bend it to your will. I highly recommend spending some time writing intelligent rules that perform the necessary 301 redirects, because you don't want to take chances with SEO (Search Engine Optimization).
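For example, on IIS7 with the URL Rewrite module installed, a rule like the following would 301-redirect extensionless URLs to their trailing-slash form (this is illustrative only; on older IIS versions a third-party engine such as ISAPI_Rewrite plays the same role):

```xml
<!-- web.config excerpt: canonicalize URLs by appending a trailing slash -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="AddTrailingSlash" stopProcessing="true">
        <match url="(.*[^/])$" />
        <conditions>
          <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
        </conditions>
        <action type="Redirect" url="{R:1}/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```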

CON - Challenges if you're not running IIS7

It's clear that the last couple of versions of IIS have been major improvements over their predecessors. IIS7 takes .NET integration to an entirely new level. There is already a good page that covers the challenges you'll face if you're running IIS6 or earlier. I'll just list them here for brevity:

  • .NET needs to handle all page requests to ensure that the MVC pages will be processed. This leads to bad performance of static files.
  • HTTP Compression through IIS6 doesn't work, because the MVC pages are dynamic.
  • The homepage gives a 404 when hosted on the root of a domain.


If I needed to build a new site from scratch and was able to use IIS7, it would be extremely likely that I would choose ASP.NET MVC. It's a joy to work with (possibly because it's "new"), and it just makes sense. If I needed to work with an existing site, I would certainly have to consider the pros and cons I mentioned above. ASP.NET MVC gives us an amazing new tool in our huge Microsoft toolbox.

I'm Jason Young, a software engineer. This blog contains my opinions, which my employer, Microsoft, may not share.
