Charlie Gibbs <cgi...@kltpzyxm.invalid> wrote:
> I'm running NoScript on my web browsers and have things
> locked down pretty tightly. Lately, though, more and more
> web sites have stopped working. It seems that JavaScript
> has become so pervasive that any attempts to block it cause
> many sites to fail. I use Seamonkey as my browser, since
> I don't like the way the Firefox user interface has been
> evolving - but more and more I find I have to grit my teeth
> and fire up a copy of Firefox (to which I haven't added any
> security options) in order to access sites that do (e.g.)
> online billing.
>
> From time to time I'll take a look at what NoScript is
> doing, and the ubiquitous Google comes up all over the
> place. I'm trying to opt out of the surveillance state -
> is this becoming an impossible dream?
Yes, in particular as soon as a site demands a Captcha, which nine
times out of ten is provided by Google these days, you're
deliberately prevented from getting through by any means without
allowing Google scripts (and that takes two reloads, since NoScript
only discovers the second required Google script after you
temporarily allow the first one). At least it doesn't force you to
allow Google Analytics.
The Captcha case aside, it is sometimes possible to get around
script requirements by inspecting the source code, or simply by
viewing the page without CSS, which can uncover lots of hidden text
and/or links. I still have lots and lots of exemptions set in my
NoScript configuration, though, plus sites like Google that I
regularly have to allow through on a temporary basis for certain
sites.
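The source-inspection trick can even be done outside the browser.
A rough sketch (the page.html here is a stand-in for a page you
saved or fetched with curl; the URLs are made up): pull the plain
hrefs out of the markup, since a link a script-driven menu hides is
usually still sitting right there in the HTML.

```shell
# Stand-in for a saved page; a real one would come from
# "Save Page As" or: curl -o page.html https://example.com/
cat > page.html <<'EOF'
<html><body>
<script src="https://www.googletagmanager.com/gtag.js"></script>
<a href="https://example.com/billing">Billing</a>
<a href="https://example.com/help">Help</a>
</body></html>
EOF

# Crude href extraction - no JS executed, no CSS applied.
grep -o 'href="[^"]*"' page.html | sed 's/^href="//; s/"$//'
```

Crude, but it often turns up the destination URL you were after, which
you can then open directly with scripts still blocked.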
Then you've got some sites where the JS doesn't work in Firefox
even without extra privacy settings and add-ons. Sometimes those
sites still work in the last pre-Quantum Firefox ESR release.
Without denying that there are useful applications for it (a small
percentage of what it actually gets used for, mind you), I wish that
client-side scripting had never been invented.
--
__ __
#_ < |\| |< _#