A Firefox "bug" (633773 -
https://bugzilla.mozilla.org/show_bug.cgi?id=633773#c47) reported by the
same person who submitted a 29 page complaint against Google to the U.S.
Federal Trade Commission
(
http://online.wsj.com/public/resources/documents/FTCcomplaint100710.pdf
and
http://paranoia.dubfire.net/2010/10/my-ftc-complaint-about-googles-private.html)
was closed on 3/16 with a target release of Firefox 14. This adjustment
to the way Google searches are handled in Firefox will have significant
and sweeping ramifications for users and web publishers. To me it
appears that one individual's agenda is being pushed through to all
users via source code changes in the Firefox browser. I would like the
issues in the note below seriously considered before the change is
released to the public in a forthcoming Firefox update.
In response to the last comment on the bug, I'm posting in this forum,
as requested:
[Sid Stamm [:geekboy] 2012-03-23 11:22:09 PDT
John A. Bilicki III, Virtual_ManPL: as abillings said, this is a fine discussion you're having but please take it over to mozilla.dev.platform.
That is a better forum for this discussion and you will get more engagement there than on a closed bug.
http://www.mozilla.org/about/forums/#dev-platform]
*Dear Mozilla & Google: Give users choice in search query tracking*
I'm all for user privacy and respecting sensitive search information.
However, my problem with enabling this change to https (no query
tracking) in the Google search plugin by default is that it assumes a
high degree of user ignorance and does not give users the choice of
whether to disclose information. IMHO, this flies in the face of the
very freedoms and rights central to the spirit of the change. It eliminates choice,
and effectively places Mozilla into the position of "big brother" making
choices for "ignorant" or "uninformed" users. This is not democracy of
the web, this is decision making from the top down. It could actually
be considered public policy through source code, and it completely
side-steps the democratic process, freedom, and user choice.
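For readers who haven't followed the bug: the change itself is small. The
Google search plugin that ships with Firefox is an OpenSearch XML file, and
switching its search template from http to https means the browser will no
longer send the Referer header (and with it, the q= query string) when the
results page links out to a plain-http site, per RFC 2616 section 15.1.3.
The fragment below is a rough sketch of the plugin format to illustrate the
change, not the exact file Mozilla ships:

```xml
<SearchPlugin xmlns="http://www.mozilla.org/2006/browser/search/">
  <ShortName>Google</ShortName>
  <!-- Before: queries traveled over plain HTTP, so the q= parameter
       could reach third-party sites via the Referer header: -->
  <!-- <Url type="text/html"
            template="http://www.google.com/search?q={searchTerms}"/> -->
  <!-- After: HTTPS by default. Browsers do not send a Referer from an
       https: page to an http: destination, so the query is withheld: -->
  <Url type="text/html"
       template="https://www.google.com/search?q={searchTerms}"/>
</SearchPlugin>
```

That one-character protocol change is what removes query data from
publishers' referrer logs, which is why the default matters so much.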
I am a user of the Internet, of Firefox, and I am perfectly capable of
deciding whether or not I want search engines to disclose my search
queries to third parties. But with this change, I cannot make
that choice. It is effectively made for me by Mozilla. And that feels
too much like dictatorship, IMO.
If given the choice, I choose to "OPT IN" to letting search engines
disclose my queries to third parties. Why? Because I know that I have
a better online experience when my usage data is used by publishers to
create more compelling, relevant experiences for me. And as an online
marketer, I use search query data to create the kinds of experiences
that users enjoy and that help them accomplish their goals quickly and
efficiently.
To me this is about choice - and making this change takes that away from
users. It's a very slippery slope. What if Firefox decided it was in
my best interest to "protect" me from certain categories of content it
deemed inappropriate and blocked those by default without a way for me
to change the decision? I don't believe that would be tolerated. In
fact, when I encounter a potentially risky site, Mozilla warns me, but
still gives me choice to continue onto that site. It also allows me to
"Start Private Browsing" to shield my browsing and search history from
other users of the same computer, and informs me data might still be
shared with ISPs, employers, or other sites. These are good examples of
giving users choice and informing them of the effects of their choices.
If I'm signed out of Google and I search with the Firefox search bar,
I'm okay with my queries being transmitted to third parties. If I don't
want that, then I should be able to click a button or otherwise activate
a choice to not be tracked (or just go to Google secure search).
Likewise, if I want to be tracked, then I should be able to specify that
also. I don't want software making decisions for me. And I don't
believe other users want that either.
Dear Mozilla & Google: Give users choices for how and when they want to
be tracked and whether that data is shared with third parties.
An alternative to the fix made in this "bug" is allowing users to opt in
to or out of "secure search," just like private browsing and other
user-facing browser choices. Defaulting Firefox's Google search box to
secure search eliminates user choice, and that cannot be a good thing
for the web.
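Concretely, the opt-in/opt-out could be as simple as a single preference
exposed through the options UI or about:config, alongside the existing
private-browsing controls. The preference name below is invented purely to
illustrate the shape of the choice; it is not an existing Firefox pref:

```javascript
// Hypothetical prefs.js entry (name invented for illustration):
//   true  = use the HTTPS Google search plugin (queries withheld
//           from third-party referrer logs)
//   false = keep the HTTP plugin (queries visible to referred sites)
user_pref("browser.search.google.secure", true);
```

Shipping a default either way is fine; the point is that the user, not the
source tree, should hold the toggle.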
Thanks for your consideration.
Anthony Long