Optimizing Web Applications for Search Engines

Web applications are very useful and usually easy to integrate into a website. The most popular types are content management systems (e.g. Joomla), blogs (e.g. WordPress), forums (e.g. phpBB) and online shops (e.g. osCommerce). Search engine algorithms change over time, so it would not be sensible to ship pre-optimized applications that could soon become outdated. Programmers of such applications rightly focus on the usefulness of their application, which leaves webmasters or SEOs with the challenge of preparing and optimizing a web application for search engines.

There are multiple challenges for an SEO optimizing web applications. The first is gaining sufficient understanding of the system to know where to apply changes, which requires good knowledge of programming and databases. Understanding a system can take a lot of time, which is why SEOs usually specialize in a few systems. What's more, the SEO has to keep up with the application's feature and security updates.

Soon the question arises of how far to go with optimization. The system should not be changed in a way that makes future updates costly to apply. There is also the responsibility to preserve the user experience, which does not make it easier to find the right level of optimization. Optimization levels range from basic crawlability fixes to pursuing ambitious rankings.

SEOs try to apply the same optimization techniques to applications that they would use on "simple" HTML collections. Hence, only the aspects specific to application optimization seem noteworthy. One important aspect is that many applications dynamically encode data into URLs. SEOs change such a system so that these dynamic URLs look as if they were static. Sometimes this means an SEO has to change the way a system manages sessions, namely switching to cookie-based session handling so that session IDs no longer appear in URLs.
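The URL rewriting described above can be sketched as a small mapping function. This is a minimal illustration, not any real system's code: the `index.php?id=…&title=…` scheme and the `/articles/<id>-<slug>` path layout are hypothetical choices standing in for whatever the application actually uses.

```python
import re
from urllib.parse import urlparse, parse_qs

def slugify(title):
    """Reduce a title to lowercase words joined by hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def static_url(dynamic_url):
    """Rewrite a dynamic CMS URL into a static-looking path.

    Assumes a hypothetical CMS that exposes articles as
    index.php?id=<n>&title=<t>; the /articles/<id>-<slug> scheme
    is an illustrative convention, not a standard.
    """
    query = parse_qs(urlparse(dynamic_url).query)
    article_id = query["id"][0]
    slug = slugify(query["title"][0])
    return f"/articles/{article_id}-{slug}"

print(static_url("index.php?id=42&title=Optimizing Web Apps"))
# /articles/42-optimizing-web-apps
```

In practice this mapping is usually implemented in the web server's rewrite layer (e.g. Apache's mod_rewrite) rather than in application code, but the translation logic is the same in either place.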

Static content is important for search engines as it increases the probability of accurate categorization. Nevertheless, the SEO has to make sure the system doesn't produce lots of pages with very similar content. Duplicate-content problems are nasty because pages get pulled from the index seemingly at random. Popular candidates for such pages are those that return messages such as "sorry, no entry at this time". Search engines should be able to identify previously created pages as deleted once their content is no longer available. An SEO can also consider implementing dynamic adaptation of the site structure if that seems appropriate.
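The deleted-content point can be sketched as follows: instead of answering every request with a 200 page saying "sorry, no entry", the application should return a 410 (or at least a 404) for removed pages so crawlers can drop them from the index. The lookup tables below are hypothetical stand-ins for the application's database.

```python
from http import HTTPStatus

# Hypothetical stand-ins for the application's database:
ARTICLES = {1: "First post", 2: "Second post"}  # pages that exist
DELETED_IDS = {3}                               # pages that once existed

def respond(article_id):
    """Return the HTTP status a crawler should receive for an article id."""
    if article_id in ARTICLES:
        return HTTPStatus.OK         # 200: page exists, serve content
    if article_id in DELETED_IDS:
        return HTTPStatus.GONE       # 410: page was removed for good
    return HTTPStatus.NOT_FOUND      # 404: page never existed

print(respond(1), respond(3), respond(99))
```

The 410/404 distinction is a refinement; the essential fix is simply not to serve an indexable 200 page for content that no longer exists.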

Finally, there is the aspect of altering the database structure. Web applications often lack data containers that allow individual pages to be described and optimized separately. An SEO implements these data containers and also changes the respective administration pages accordingly.
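As a sketch of such data containers, the fields below (`seo_title`, `seo_description`) are hypothetical columns added alongside a page's own content; the names and fallback rules are illustrative, not taken from any real system.

```python
# A page record with hypothetical per-page SEO fields added
# alongside the application's own content fields.
page = {
    "title": "Blue Widgets",
    "body": "A long product description of blue widgets ...",
    "seo_title": None,        # new field, editable in the admin interface
    "seo_description": None,  # new field, editable in the admin interface
}

def meta_title(page):
    """Prefer the hand-written SEO title, fall back to the content title."""
    return page["seo_title"] or page["title"]

def meta_description(page, limit=160):
    """Prefer the SEO description, else truncate the body text."""
    text = page["seo_description"] or page["body"]
    return text[:limit]

print(meta_title(page))  # Blue Widgets
```

The fallback design means the new fields are optional: pages render as before until someone fills in the SEO fields through the administration interface.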