Generate sitemap files for ASP.NET navigation controls. View reports on broken and redirected links, including both the source and destination page of each link. Scan static and dynamic websites such as portals, online stores, blogs, and forums. Scan websites hosted on the internet, localhost, a LAN, a CD-ROM, or local disks. Scan websites from multiple start paths, which is useful for websites that are not fully cross-linked.
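For illustration, here is a minimal Python sketch of crawling from multiple start paths; the start URLs, the simple href regex, and the page limit are assumptions for the example, not the product's actual implementation:

    from collections import deque
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen
    import re

    START_URLS = ["https://example.com/", "https://example.com/blog/"]  # assumed start paths

    def crawl(start_urls, limit=100):
        seen, queue = set(start_urls), deque(start_urls)
        while queue and len(seen) <= limit:
            url = queue.popleft()
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue  # unreachable or broken link; a real crawler would record it
            # naive link extraction; a real crawler parses HTML properly
            for href in re.findall(r'href="([^"]+)"', html):
                link = urljoin(url, href)
                if urlparse(link).netloc == urlparse(url).netloc and link not in seen:
                    seen.add(link)
                    queue.append(link)
        return seen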
Handle sites that use multiple domain names with the same content. Supports crawler filters, robots.txt, custom connection and read timeout values, removal of session IDs, scanning of JavaScript and CSS files, proxy setup, website login, and various other options. Configure the number of simultaneous connections to use. The crawler is feature-rich, supporting many website crawling options.
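As a sketch of two of these options, honoring robots.txt and capping simultaneous connections, the following assumes a hypothetical host, URL list, and worker count; it uses Python's standard robots.txt parser and a thread pool:

    from concurrent.futures import ThreadPoolExecutor
    from urllib.robotparser import RobotFileParser
    from urllib.request import urlopen

    rp = RobotFileParser("https://example.com/robots.txt")  # assumed host
    rp.read()

    def fetch(url):
        if not rp.can_fetch("*", url):
            return url, None  # skipped: disallowed by robots.txt
        try:
            return url, urlopen(url, timeout=10).status  # timeout = custom read timeout
        except OSError:
            return url, "error"  # would appear in a broken-link report

    urls = ["https://example.com/", "https://example.com/about"]  # assumed URLs
    with ThreadPoolExecutor(max_workers=4) as pool:  # 4 simultaneous connections
        for url, status in pool.map(fetch, urls):
            print(url, status)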
Create text, HTML, RSS, and XML sitemaps to help search engines such as Google and Yahoo crawl and index your website.
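For reference, an XML sitemap follows the sitemaps.org format; this minimal Python sketch writes one, with the URL list assumed for the example:

    import xml.etree.ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # sitemaps.org namespace
    urlset = ET.Element("urlset", xmlns=NS)
    for page in ["https://example.com/", "https://example.com/contact"]:  # assumed URLs
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)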