I have what may be a really stupid question, but could someone
please explain why sniffing Internet Explorer's user agent (scanning
the user-agent string for a match with a specific browser) in order
to add compatibility with it is a bad idea? Obviously, I understand
the problems with sniffing for a particular browser and making
changes to a website that may break future versions of that browser,
and I recognize the added maintenance costs. The first, however, is
easily avoided by limiting those changes to a single version of a
single browser (MSIE 7), and the latter is certainly NOT avoided by
any other recommended technique, such as a conditionally commented
link to a CSS file.
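For reference, the conditional-comment technique mentioned above looks
something like this (a sketch; the stylesheet names are assumptions for
illustration):

```html
<!-- Served to every browser; all of them load the main stylesheet. -->
<link rel="stylesheet" href="main.css" />
<!-- Only IE 7 parses the contents of this conditional comment;
     every other browser treats the whole thing as an HTML comment. -->
<!--[if IE 7]>
  <link rel="stylesheet" href="ie7-fixes.css" />
<![endif]-->
```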
Am I missing something here, or is this practice frowned upon simply
because it was historically used in... well, misguided ways?
Thanks,
Adam
Determining a user agent is not always, as you classify it, "a bad
idea." (In fact, mobile redirection as we know it today depends on
filtering browsers by their user agents. The type of redirect matters,
however: server-side redirects based on user-agent strings are usually
preferred over, say, the JavaScript variety.)
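A minimal sketch of that kind of server-side filtering, written as a
plain function so the matching logic stands on its own (the pattern
list and the `mobileUrl`/`desktopUrl` parameter names are assumptions
for illustration, not an exhaustive mobile-detection scheme):

```javascript
// Substrings that mark a user agent as mobile. Illustrative only.
const MOBILE_PATTERNS = ["iPhone", "Android", "BlackBerry", "Windows CE"];

// Decide a redirect target from a raw User-Agent header value.
function redirectTarget(userAgent, mobileUrl, desktopUrl) {
  const ua = String(userAgent || "");
  const isMobile = MOBILE_PATTERNS.some(function (pattern) {
    return ua.indexOf(pattern) !== -1;
  });
  return isMobile ? mobileUrl : desktopUrl;
}
```

A real deployment would call this from the web server's request handler
and answer with an HTTP 302 to the chosen URL, keeping the decision out
of client-side script entirely.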
What IS frowned upon is user-agent identification used to avoid
cross-compatibility work, i.e. being lazy about cross-browser
standards and testing, and redirecting users to entirely different
versions of a page based on the user-agent string. Unless this is
absolutely necessary (and I can think of only a few, very specific
reasons why it would be), there can and should be a cross-browser
alternative.
So, in short, the frowns come from a lack of 'elegance.' With good
CSS and XHTML practices, and standards checking the whole way, user-
agent filtering becomes less and less necessary. A page that relies
on such redirects is less desirable than one that does not.
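That cross-browser alternative usually means feature detection: test
for the capability itself rather than guessing it from the user-agent
string. A sketch, using the classic IE-versus-standards split between
`attachEvent` and `addEventListener` (the return value is only there
to show which branch ran):

```javascript
// Attach an event handler using whichever API the environment
// actually provides, instead of branching on the user-agent string.
function addListener(target, type, handler) {
  if (typeof target.addEventListener === "function") {
    target.addEventListener(type, handler, false); // standards browsers
    return "addEventListener";
  }
  if (typeof target.attachEvent === "function") {
    target.attachEvent("on" + type, handler); // old Internet Explorer
    return "attachEvent";
  }
  throw new Error("No event API available on this object");
}
```

The page behaves the same everywhere the capability exists, and no list
of browser names ever has to be maintained.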
______________________
Ryan Dellolio
Yanaboo Enterprises, Consulting, Research
http://www.yanaboo.com/
http://ryan.yanaboo.com/