Xenu's Link Sleuth strives to find broken links before site visitors do.
Few things are more frustrating for website visitors than broken hyperlinks that lead to an error message instead of the content they seek. Tilman Hausherr, a software developer by trade, shares that frustration. He was equally displeased with a software product that promised to identify broken links so they could be repaired. Finding (and then repairing) broken hyperlinks before visitors stumble on them is one of the most important duties in maintaining a website, and Hausherr felt he could make that job easier for himself and others.
In 1997, while checking his personal website for broken links, Hausherr found that the WebAnalyzer software he was using was ineffective. That unhappiness with the product led him to create Xenu's Link Sleuth, spidering software that checks sites for broken links.
"I just thought that I could [create] a simple, easy-to-use product with fewer bugs than the existing product; and at the same time, learn about internet programming," says Hausherr, who is based in Berlin. He says the main difference between Link Sleuth and similar products and services on the market is the price: Link Sleuth is free. WebAnalyzer is no longer offered, though another competitor, NetMechanic, still exists.
The technology used to support Link Sleuth is Visual C++ with MFC (Microsoft Foundation Classes). Its main function is to check websites for broken links. According to Hausherr, link verification is done on "normal" links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts, and Java applets.
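Link Sleuth itself is a Windows application written in Visual C++; purely as an illustration of what verifying links across these element types involves, here is a minimal Python sketch (not Hausherr's code) that pulls checkable URLs out of an HTML page using only the standard library. The tag-to-attribute table is an assumption modeled on the element types the article lists.

```python
from html.parser import HTMLParser

# URL-bearing attributes by tag, mirroring the element types the article
# says are verified: links, images, frames, plug-ins, backgrounds,
# image maps, style sheets, scripts, and Java applets.
LINK_ATTRS = {
    "a": "href",
    "img": "src",
    "frame": "src",
    "iframe": "src",
    "script": "src",
    "link": "href",        # style sheets
    "applet": "code",      # Java applets
    "embed": "src",        # plug-ins
    "body": "background",  # page backgrounds
    "area": "href",        # image maps
}

class LinkExtractor(HTMLParser):
    """Collect every URL-bearing attribute found in an HTML document."""

    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        wanted = LINK_ATTRS.get(tag)
        for name, value in attrs:
            if name == wanted and value:
                self.urls.append(value)

parser = LinkExtractor()
parser.feed('<html><body background="bg.gif">'
            '<a href="/page.html">link</a>'
            '<img src="logo.png"></body></html>')
print(parser.urls)  # ['bg.gif', '/page.html', 'logo.png']
```

A real checker would then resolve each extracted URL against the page's base address and request it to see whether it still resolves.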
To begin using Link Sleuth, users must download a zip file and then unzip it into a directory. To check a site's links, users just enter the web address or click the "browse" button to check links on a local HTML file. A user can select particular URLs to be checked. When a root URL is selected, Link Sleuth can check local pages linked from it.
Among Link Sleuth's features are a simple user interface and the ability to recheck broken links. It supports SSL websites (https://) and can conduct partial testing of FTP and Gopher sites. Link Sleuth also detects and reports redirected URLs.
It also displays a continuously updated list of URLs that can be sorted by a variety of criteria; for instance, it can display broken links by link or by page, and it will show redirected URLs. Hausherr says reports showing which links are broken can be produced at any time.
Link checking is accelerated by preemptive multithreading, which enables Link Sleuth to retrieve several web pages simultaneously. The number of threads defaults to 30, but it can be set to any value from 1 to 100.
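The mechanism described here, a pool of worker threads fetching pages in parallel, with a configurable thread count, can be sketched in a few lines of Python. This is an illustration of the general technique, not Link Sleuth's actual C++/MFC implementation:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

MAX_THREADS = 30  # Link Sleuth's default; the article says 1-100 is allowed

def check_link(url):
    """Return (url, status): an HTTP status code or an error string."""
    try:
        # A HEAD request keeps the check cheap: we want the status, not the body.
        with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
            return url, resp.status
    except HTTPError as err:
        return url, err.code          # e.g. 404 -> a broken link
    except URLError as err:
        return url, str(err.reason)   # DNS failure, refused connection, ...

def check_all(urls, threads=MAX_THREADS):
    """Check many URLs concurrently, as a multithreaded crawler would."""
    threads = max(1, min(threads, 100))  # clamp to the article's 1-100 range
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return dict(pool.map(check_link, urls))
```

Because each check spends most of its time waiting on the network, running 30 of them at once speeds up a crawl dramatically even under Python's global interpreter lock.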
It's a process that has attracted attention from a wide range of users. Hausherr says that Link Sleuth's users come from various business segments and include individuals who monitor their private websites as well as webmasters who are charged with checking links on their organizations' intranet or internet sites. They enlist the help of Link Sleuth to ensure that site visitors don't receive error messages when navigating the sites.
Hausherr says his immediate plans for Link Sleuth include a maintenance release with some small enhancements. He adds that these improvements are already available to beta users through a user group mailing list. Users have been Hausherr's inspiration for making Link Sleuth a better product over time. "In improving the product over all these years, I've listened to a lot of wishes of the users," he says.
Because his main focus is on his career as a software developer, Hausherr says he doesn't have a 5-year growth plan for Link Sleuth. His goals center on improving the product's functionality and making it even easier to use. "What I would like to do if time allows is to replace the HTML parser because it is not really up to date," says Hausherr.
Regardless of what improvements Hausherr makes, the need for the core product is likely to continue over time. As websites compete for customers, the need to find and repair broken links will undoubtedly grow. Website operators will have to ensure that all of the links on their sites are functional, or they will risk losing customers to the competition.
Marji McClure is a freelance writer based in Connecticut. Send your comments about this column to itletters@infotoday.com.
Title Annotation: IT Spotlight
Article Type: Company overview
Date: Jun 1, 2008