I’ve recently been working with an SEO firm to improve our “keyword density”, structure, and several other things on our public website. High on their long list of recommendations was the task of producing nice, pretty URLs with relevant keywords, dashes instead of underscores, and so on. Easily said, but not so easily executed, or so I thought.

Our architecture, in a nutshell: Apache web servers fronting WebSphere application servers, running a Struts-based web application. Now if you know Struts, nine times out of ten your URLs are ugly, because a bunch of programmers developed the application without ever thinking about the impact the URLs would have on natural search, and the framework developers pretty much left you with a bunch of “do.do”.

Very quickly the SEO firm was recommending 70+ rewrite rules on the Apache server to map friendly URLs onto the URLs in the application, plus custom work for each individual URL to rewrite it to its friendly form, so that when Googlebot crawls the site it traverses the friendly URLs. I cringed at the thought of this suggestion. Not only is it unmaintainable, but when I run a local server I can’t use the rewritten URLs, because my development environment doesn’t have a full-blown HTTP server with rewrite capabilities. I knew there had to be a better solution; I just wasn’t sure what it was.
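To give a feel for what each of those 70+ rules would have looked like, here is a hypothetical mod_rewrite fragment; the paths, action name, and parameter are made up for illustration, not taken from our actual site:

```apache
# Turn on the rewrite engine for this virtual host / context.
RewriteEngine On

# Hypothetical example: map one keyword-rich, hyphenated URL
# onto the underlying Struts ".do" action it really serves.
# [PT] passes the rewritten path through to the app server,
# [L]  stops processing further rules for this request.
RewriteRule ^/products/blue-widgets/?$ /catalog/viewProduct.do?id=123 [PT,L]
```

Multiply that by every page on the site, keep the Apache config in sync with the application by hand, and you can see why the approach felt unmaintainable.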


GreatWebGuy
