Oh no, not another police reference! Never mind. Dr. Watson checks your HTML against lax, normal and strict criteria and allows you to include Netscape 4 and Explorer 4 specifics - or at least, it lets you find out what will not work in a particular browser so that you can avoid it.
It also goes a couple of steps further by verifying your links, both external HREFs and images. It will generate a word count, spell-check the non-HTML text and give you an estimate of download times. Very useful, too, is its ability to report on your search engine compatibility and on how popular your site is in terms of links from other sites.
The report generated by Dr. Watson is clear and concise, giving a table of page download times for different modem speeds and a list of page and image link verifications. It highlights only the suspect source code, giving suggestions and links to the context in a copy of the full source at the bottom of the page. On the downside, it reports all errors as being on line 1, which isn't terribly helpful, and it gave me a very long list of `extended mark-up' attributes like `onMouseOver' which aren't recognised if you select `Browser extensions allowed - None'.
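To give you an idea of what trips that strictest setting, here is a hypothetical rollover of the sort found on countless pages of the day - not an example from my own site:

```html
<!-- Flagged under `Browser extensions allowed - None':
     onMouseOver and onMouseOut are treated as unrecognised
     extended mark-up attributes. -->
<a href="morestuff.html"
   onMouseOver="window.status='More info'; return true;"
   onMouseOut="window.status=''; return true;">More info</a>
```

Select one of the browser-extension settings instead and attributes like these pass without comment.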
What I found particularly interesting was that it was able to tell me that there are nearly 2000 links to my site from others! How it managed that in just a few seconds I don't know, but I am impressed.
Like Bobby, Dr. Watson still hasn't caught up with Mac Explorer 5 and Netscape 6 but it is a lot less intimidating than the W3C Validator's `death or glory' approach.
Dr. Watson - http://watson.addy.com/
So, there are three entirely different approaches to HTML syntax checking. Take your pick. You can check for adherence to `absolute' W3C standards, use the more `real-life' browser compatibility tests, or just upload your site to the server and cross your fingers.