
Hosts-file-blocked sites causing delays


J. P. Gilliver (John)

Feb 6, 2013, 5:34:26 PM
I have some sites (including facebook and twitter) blocked in my hosts
file.

Some web pages have long pauses, with "waiting for" (or something
similar - might be "waiting for response from") one of the blocked sites
showing in the status line.

Why is this? I thought putting something in the hosts file redirected
things to the local computer, so there should be no delay. So presumably
they're looking for some sort of script reply, or something?

Any suggestions how to avoid these delays? (At worst, I presume I'd have
to set up a web server on my machine?)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

All I ask is to _prove_ that money can't make me happy.

Beauregard T. Shagnasty

Feb 6, 2013, 6:19:42 PM
J. P. Gilliver (John) wrote:

> Any suggestions how to avoid these delays? (At worst, I presume I'd have
> to set up a web server on my machine?)

That might help. I've a web server on my computer and things blocked by
hosts react instantly. I don't remember what might have happened before I
had the server; that would have been in the last millennium.

If you do, naturally make sure it's not accessible from outside your LAN.

Depends on your OS; use LAMP, WAMP, or XAMPP.

There are a few sites out there that will fail, or warn you, if their
ads are not accessed. Uncommon, though.

It's not a Firefox problem.

--
-bts
-This space for rent, but the price is high

J. P. Gilliver (John)

Feb 6, 2013, 6:29:29 PM
In message <4sidna1M3ciTeI_M...@mozilla.org>, Beauregard T.
Shagnasty <a.non...@example.invalid> writes:
>J. P. Gilliver (John) wrote:
>
>> Any suggestions how to avoid these delays? (At worst, I presume I'd have
>> to set up a web server on my machine?)
>
>That might help. I've a web server on my computer and things blocked by
>hosts react instantly. I don't remember what might have happened before I
>had the server; that would have been in the last millennium.
>
>If you do, naturally make sure it's not accessible from outside your LAN.
>
>Depends on your OS; use LAMP, WAMP, or XAMPP.

It seems excessive though, just to solve this problem.
>
>There are a few sites out there that will fail, or warn you, if their
>ads are not accessed. Uncommon, though.

That'd be fair enough. But most of the sites in question do work
eventually: they just pause a while during loading.
>
>It's not a Firefox problem.
>
Well, to me it is somewhat, in that it manifests when I'm trying to load
a web page. Firefox is doing _something_.

For example:
http://thechive.com/2012/04/16/awesome-dogs-are-awesome-30-photos/a-awesome-dog-5/
(from an email someone sent me)
spends _ages_ with "Connecting to connect.facebook.net.." in the status
line, then loads fine. (c.f.n is one of the entries in my hosts file.)
The wait shows the graded-colour background of the page, so something
has loaded.

Rob

Feb 6, 2013, 7:33:46 PM
J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
> spends _ages_ with "Connecting to connect.facebook.net.." in the status
> line, then loads fine. (c.f.n is one of the entries in my hosts file.)

Probably you have some firewall or other local software on your
computer that prevents connections to http://127.0.0.1/ from failing
immediately.

When TCP packets to 127.0.0.1:80 are somehow filtered instead of
passed through, the connection will not fail immediately but after
a number of tries (a number of seconds).

When those packets take the usual route, they will be seen by your
local computer's TCP stack, which immediately returns a RST to signal
that nothing is listening on port 80.

Try this:
open a command window
enter: telnet 127.0.0.1 80

This should immediately fail, not after some wait time.
Same for port 443.
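
[A small Python sketch of Rob's telnet test, my own illustration rather
than anything from the thread: it attempts a TCP connect and reports
whether the stack refused it at once (a RST, meaning nothing is
listening) or the attempt hung until a timeout (packets filtered
somewhere along the way):]

```python
import socket
import time

def probe(host, port, timeout=5.0):
    """Attempt a TCP connection; report the outcome and elapsed time."""
    t0 = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            outcome = "connected"          # something is listening
    except ConnectionRefusedError:
        outcome = "refused"                # stack sent RST at once: no listener
    except OSError:
        outcome = "timed out/unreachable"  # packets filtered or blackholed
    return outcome, time.monotonic() - t0

print(probe("127.0.0.1", 80))
print(probe("127.0.0.1", 443))
```

[On an unfirewalled loopback with no local web server, both probes
should come back "refused" in a fraction of a second; a multi-second
wait points at the filtering Rob describes.]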

VanguardLH

Feb 6, 2013, 8:02:22 PM
"J. P. Gilliver (John)" wrote:

> I have some sites (including facebook and twitter) blocked in my hosts
> file.
>
> Some web pages have long pauses, with "waiting for" (or something
> similar - might be "waiting for response from") one of the blocked sites
> showing in the status line.
>
> Why is this? I thought putting something in the hosts file redirected
> things to the local computer, so there should be no delay. So presumably
> they're looking for some sort of script reply, or something?
>
> Any suggestions how to avoid these delays? (At worst, I presume I'd have
> to set up a web server on my machine?)

Don't use 127.0.0.1. That is localhost. Your host will actually try to
connect to a listening port on that host. That means timing out. It
should happen quickly but there is overhead. The idea was to point to
somewhere that no one was listening. I don't know why localhost became
the favorite. A better choice is 127.0.0.0. Can't have a server
listening on that address, so no timeout waiting for a listening
process that isn't there. Your client won't even bother trying to connect to
127.0.0.0. I actually timed this and found 127.0.0.0 gives a faster
timeout than 127.0.0.1.

Of course, if you're using someone else's pre-compiled 'hosts' file,
like the one from MVPS, your choice is to leave all their entries
pointing at 127.0.0.1 and have your host timeout trying to make a
connection that won't happen or use an editor to change all 127.0.0.1 to
127.0.0.0 -- except you still need the "127.0.0.1 localhost" entry (so
change all and then change this one back). Plus later if you ever want
to run your own web server on your host then all those 127.0.0.1 entries
in the 'hosts' file would be connecting to your local web server.
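
[For illustration, a hosts file edited the way described above might
look like this; the blocked hostnames are just examples mentioned in
this thread:]

```
127.0.0.1  localhost                  # keep this entry on 127.0.0.1
127.0.0.0  connect.facebook.net       # blocked entries moved to 127.0.0.0
127.0.0.0  www.facebook.com
127.0.0.0  www.google-analytics.com
```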

127.0.0.1 represents the local host. It is your host. The 'hosts' file
was intended to point at hosts, not networks. 127.0.0.0 represents a
network, not a host, but it can still be used in the 'hosts' file. You
cannot connect to a network. You can only connect to a host. Your
client will instantly fail if the local DNS lookup via 'hosts' returns
127.0.0.0 because it cannot connect to a network versus your client
having to timeout at your local host's NIC interface to see there is no
process listening there.

When I specify 127.0.0.x where x = 1 to 254 (so the IP address is for a
host), it takes longer for the web browser to abort the connection
attempt. You see "Waiting for 127.0.0.x" or some other "hunt & connect"
status (which appears more than long enough to read it). Yet if
127.0.0.0 is used (for the network), the web browser fails instantly.
Since a web page could have dozens, or more, links to 3rd
party content that I am trying to nullify, it would seem faster rejects
(literally not trying at all to make the connects) would result in a
faster complete time to load (and render) the web page minus all the
blocked content.

Try it yourself. Have your web browser go to "http://127.0.0.1", notice
its status message, and how long before it errors. Then try to connect
to "http://127.0.0.0" and notice the web browser IMMEDIATELY fails.
Well, if it's faster on just one blocked local DNS lookup, imagine how
much faster a page would load if you did the same 127.0.0.0 return on
the dozens if not hundreds of off-domain links in a web page or scripts.

Personally I don't bother with using a 'hosts' file to block unwanted
content. The MVPS 'hosts' file, for example, is something like 20K
entries, and each search through that file requires OPEN, READ, and
CLOSE operations, and the search is linear from top to bottom (it is a
file, not a database). The size of these pre-compiled 'hosts' files has
long exceeded the recommended max of 8K entries, after which a lookup
can take longer than the round trip for a DNS server request. The
list is of hosts and that means all those hosts at a domain. Just look
at how many are listed for doubleclick.com. You cannot use 'hosts' to
redirect a domain to localhost. You have to specify hostnames in there.
Plus no one that uses these pre-compiled lists ever reviews them to
verify that they agree with the author(s) that compiled these
lists. There is no quick and easy means of disabling the client from
using the 'hosts' file. It's not like there's a toolbar button to
toggle your web browser from using and not using the 'hosts' file. You
may visit a site that doesn't render properly but it would if you had a
blank 'hosts' file (just the "127.0.0.1 localhost" entry). 'hosts' is a
rather clumsy means of blocking unwanted content.

By the way, there are many sites that don't merely have links to the
social sites. They actually run scripts to gather statistics for those
sites. That is, they're in league with them to aggregate stats on their
visitors and how many visit the social sites for which they provide nav
routes. Those scripts might be on- or off-domain of the site you visit
but they are still running. You might want to look at add-ons that
disable scripts (e.g., NoScript) and you pick which ones are allowed to
run at a site, or an add-on that disables the hosted social site scripts
(I think Ghostery is one where you just pick some options to block
hosted social site scripts but there may be other add-ons to do the same
thing). Just visiting a site will run those hosted social site scripts
so they know you visited the first site. They're gathering stats on
your visits. You don't even have to click on the Like or Tweet buttons
you see at some site you happen to visit for them to know you were
there.

You may find sites are more speedy when you disable their scripts. Too
many sites require scripts to work so I configure NoScript to temporary
allow scripts at the 2nd level domain (domain.tld) for sites that I
choose to visit. Off-domain scripts won't run unless I either
temporarily allow them or whitelist them. For example, many sites use
JQuery that is hosted at Google's Code site. Not running the hosted
scripts for the social sites on the sites that you visit may eliminate
those delays you've been experiencing.

Another choice is to use Adblock Plus, an add-on for Firefox. That
will disable the URLs in scripts hosted on the site you visit that
report to the social sites.

http://googleplus.wonderhowto.com/how-to/prevent-social-networks-from-tracking-your-internet-activities-0130280/

I have Adblock Plus (besides NoScript) and subscribe it to the
EasyPrivacy+EasyList and Malware Domains lists (plus I have a few of my
own blocking filters). It's up to you to decide if Adblock's policy on
defaulting to allow some ad sites to work matches your goals.
Many sites wouldn't be around if it weren't for ad revenue. They have a
list of rules to which an ad source must comply to be polite in
presenting their content in a web page. If you're against all ads and
refuse to even help those providing a web site out of their own pockets
to give you information then disable the "Allow some non-intrusive
advertising" option. You can read about this policy and what ABP deems
non-intrusive at https://adblockplus.org/en/acceptable-ads.

Note that using Adblock will significantly increase the memory footprint
for Firefox along with slowing the load of Firefox. Where Firefox
might've loaded and be ready in under 2 seconds will take 6 seconds with
ABP enabled. The longer the blocklists the larger the firefox.exe
process will be. When I used ABP with only my short list of blocked
domains, Firefox loaded faster and consumed much less memory. When I
add the blocklists, and with each one that is subscribed, Firefox takes
longer to load and eats more memory. So you'll have to decide if the
longer start time and increased memory usage is a sacrifice that is
outweighed by the time saved to render the web sites you visit within
one session of Firefox. At least with ABP, however, you can easily
disable it to get a page working whereas with a 'hosts' file you'll have
to unload the web browser, rename or delete the 'hosts' file, reload the
web browser, and then revisit the web page to see if it now works
without all the blocks.

Beauregard T. Shagnasty

Feb 6, 2013, 8:01:59 PM
J. P. Gilliver (John) wrote:

> Beauregard T. Shagnasty writes:
>>J. P. Gilliver (John) wrote:
>>> Any suggestions how to avoid these delays? (At worst, I presume I'd
>>> have to set up a web server on my machine?)
>>
>>Depends on your OS, use LAMP, WAMP or XAMP(?).
>
> It seems excessive though, just to solve this problem.

I'd agree with that, as most users do not have such a server. Only us web
authors do those.

>>It's not a Firefox problem.
>>
> Well, to me it is somewhat, in that it manifests when I'm trying to load
> a web page. Firefox is doing _something_.

How about with other browsers? Which other browsers have you tested on the
page in question?

> For example:
> http://thechive.com/2012/04/16/awesome-dogs-are-awesome-30-photos/a-awesome-dog-5/
> (from an email someone sent me)

Loads instantly for me. Dogs in a circle. It's a wordpress page, and it
even tells me "generated in 0.889 seconds".

> spends _ages_ with "Connecting to connect.facebook.net.." in the status
> line, then loads fine. (c.f.n is one of the entries in my hosts file.)
> The wait shows the graded-colour background of the page, so something
> has loaded.

You mean the entire dogs page doesn't load until later? Not just a portion
of it?

It loads a script from Facebook (I hate the proliferation of that!).

<script src="http://connect.facebook.net/en_US/all.js" type="text/javascript"></script> .. which is a *178,827 byte* file. <sheesh>

g

Feb 6, 2013, 8:32:39 PM
to mozilla firefox support

On 02/07/2013 01:02 AM, VanguardLH wrote:
<>

> Don't use 127.0.0.1. That is localhost.

strange how one can be trying the same thing another is and both
having problems.

my problem is a bit different in that many of the sites i have in
'host' file do not get redirected to 127.0.0.0 or 127.0.0.1. they
go straight to site i am trying to eliminate.

suggestions?

--

peace out.

tc.hago,

g
.

in a free world without fences, who needs gates.


Barbara

Feb 6, 2013, 8:46:48 PM
That site (thechive) is calling WordPress, which I have noticed also
causes lags/delays on my win 7 computer. On my old XP computer, it was
incredibly slow to load anything using WordPress.
I also use AdMuncher, which blocks a lot of ads and other unwanted
stuff.

Barbara

Refrsxf

Feb 6, 2013, 8:56:43 PM
"VanguardLH" <V...@nguard.LH> wrote in message
news:9eidnb_-lJB7YY_M...@mozilla.org...
> Don't use 127.0.0.1. That is localhost. Your host will actually try to
> connect to a listening port on that host. That means timing out. It
> should happen quickly but there is overhead. The idea was to point to
> somewhere that no one was listening. I don't know why localhost became
> the favorite. A better choice is 127.0.0.0.
> ...
> Try it yourself. Have your web browser go to "http://127.0.0.1", notice
> its status message, and how long before it errors. Then try to connect
> to "http://127.0.0.0" and notice the web browser IMMEDIATELY fails.

Hey, thanks. That never occurred to me. I only have about 10 entries in my
host file, but they're all getting converted to 127.0.0.0 (cept localhost).

---


»Q«

Feb 6, 2013, 9:12:59 PM
On Wed, 6 Feb 2013 22:34:26 +0000
"J. P. Gilliver (John)" <G6...@soft255.demon.co.uk> wrote:

> I have some sites (including facebook and twitter) blocked in my
> hosts file.
>
> Some web pages have long pauses, with "waiting for" (or something
> similar - might be "waiting for response from") one of the blocked
> sites showing in the status line.
>
> Why is this? I thought putting something in the hosts file redirected
> things to the local computer, so there should be no delay. So
> presumably they're looking for some sort of script reply, or
> something?
>
> Any suggestions how to avoid these delays? (At worst, I presume I'd
> have to set up a web server on my machine?)

IMO, Rob posted a great explanation of the most likely cause of the
delays. As I guess you know, the hosts file concept wasn't implemented
with blocking in mind; it was supposed to provide quick lookups of
known IPs, not block addresses by directing traffic to localhost.

An easy (compared to installing a local server) workaround should be to
install AdBlock Plus* and use it to block all the sites listed in
hosts. That should prevent Firefox from making any queries for them,
so you won't have to wait while your networking system tries to
communicate with itself.

That should work with hosts intact, but to solve the underlying issue,
I'd clear the hosts file as well and find other ways to block whatever
needs blocking.

* https://adblockplus.org/




Beauregard T. Shagnasty

Feb 6, 2013, 9:41:42 PM
»Q« wrote:

> An easy (compared to installing a local server) workaround should be to
> install AdBlock Plus* and use it to block all the sites listed in hosts.

..then I would have to install Adblock Plus in all my browsers. <g> No,
I wouldn't bother doing that; I'll just use the hosts file.

J. P. Gilliver (John)

Feb 7, 2013, 2:28:46 AM
In message <slrnkh5tja...@xs8.xs4all.nl>, Rob
<nom...@example.com> writes:
>J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
>> spends _ages_ with "Connecting to connect.facebook.net.." in the status
>> line, then loads fine. (c.f.n is one of the entries in my hosts file.)
>
>Probably you have some firewall or other local software on your
>computer that prevents connections to http://127.0.0.1/ from failing
>immediately.

Interesting. Thanks for coming back. When I click on the above "link",
there is a pause of less than a second, then Firefox comes back with
Unable to connect.
>
>When TCP packets to 127.0.0.1:80 are somehow filtered instead of
>passed through, the connection will not fail immediately but after
>a number of tries (a number of seconds).
>
>When those packets take the usual route, they will be seen by your
>local computer's TCP stack, which immediately returns a RST to signal
>that nothing is listening on port 80.
>
>Try this:
>open a command window
>enter: telnet 127.0.0.1 80
>
>This should immediately fail, not after some wait time.
>Same for port 443.

For both of those, my firewall pops up a "do you want to allow" window;
as soon as I say "deny", the Connect failed comes back immediately. When
I'm accessing the web pages that include something from my hosts-blocked
list, nothing pops up from the firewall (very simple old one - Kerio
2.1.5) - just a _long_ pause.

Thanks again for responding.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

"This situation absolutely requires a really futile and stoopid gesture be done
on somebody's part." "We're just the guys to do it." Eric "Otter" Stratton (Tim
Matheson) and John "Bluto" Blutarsky (John Belushi) - N. L's Animal House
(1978)

J. P. Gilliver (John)

Feb 7, 2013, 2:37:54 AM
In message
<mailman.673.1360200650...@lists.mozilla.org>, g
<gel...@bellsouth.net> writes:
>
>On 02/07/2013 01:02 AM, VanguardLH wrote:
><>
>
>> Don't use 127.0.0.1. That is localhost.
>
>strange how one can be trying something that another is and both
>having problems.
>
>my problem is a bit different in that many of the sites i have in
>'host' file do not get redirected to 127.0.0.0 or 127.0.0.1. they
>go straight to site i am trying to eliminate.
>
>suggestions?
>
How do you know they are going there - do they succeed and display
content? If you're just going by the status line, that will still say
"connecting to <blocked.site>", even if hosts is redirecting it to
local.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Christoph Schmees

Feb 7, 2013, 6:18:41 AM
On 07.02.2013 02:01, Beauregard T. Shagnasty wrote:
> ...
> It loads a script from Facebook (I hate the proliferation of that!).
>
> <script src="http://connect.facebook.net/en_US/all.js" type="text/javascript"></script> .. which is a *178,827 byte* file. <sheesh>
>

NoScript is your friend. I have set f.c.book to "not trustworthy".

Christoph

--
email:
nurfuerspam -> gmx
de -> net

g

Feb 7, 2013, 9:19:28 AM
to mozilla firefox support

On 02/07/2013 07:37 AM, J. P. Gilliver (John) wrote:
> In message
> <mailman.673.1360200650...@lists.mozilla.org>, g
> <gel...@bellsouth.net> writes:
<>

>> strange how one can be trying something that another is and both
>> having problems.
>>
>> my problem is a bit different in that many of the sites i have in
>> 'host' file do not get redirected to 127.0.0.0 or 127.0.0.1. they
>> go straight to site i am trying to eliminate.
>>
>> suggestions?
>>
> How do you know they are going there - do they succeed and display
> content?

yes

> If you're just going by the status line, that will still say
> "connecting to <blocked.site>", even if hosts is redirecting it to
> local.

not what is happening.

J. P. Gilliver (John)

Feb 7, 2013, 4:40:46 PM
In message <VuCdnWolZrSaYI_M...@mozilla.org>, Beauregard T.
Shagnasty <a.non...@example.invalid> writes:
>J. P. Gilliver (John) wrote:
[]
>> For example:
>> http://thechive.com/2012/04/16/awesome-dogs-are-awesome-30-photos/a-awesome-dog-5/
>> (from an email someone sent me)
>
>Loads instantly for me. Dogs in a circle. It's a wordpress page, and it
>even tells me "generated in 0.889 seconds".
>
>> spends _ages_ with "Connecting to connect.facebook.net.." in the status
>> line, then loads fine. (c.f.n is one of the entries in my hosts file.)
>> The wait shows the graded-colour background of the page, so something
>> has loaded.
>
>You mean the entire dogs page doesn't load until later? Not just a portion
>of it?

I've just tried it again. I got, more or less immediately, the graded
background (black-to-green). I also got, in the bottom right corner, a
sort of arrow-and-line symbol. Nothing else (not even a scrollbar).
Then, after almost exactly a minute (I think it was faster yesterday),
the page seems to pop up - the black bar at the top, the dogs picture in
a frame, the other frames, etc., and a scrollbar. It still says
"Connecting to connect.facebook.net" in the status line; after a similar
period (I didn't time it this time, I was busy typing this), the stripe
(blackish, slightly transparent) across the bottom of the window
(window, not page) appeared telling me to LIKE something.
>
>It loads a script from Facebook (I hate the proliferation of that!).
>
><script src="http://connect.facebook.net/en_US/all.js" type="text/javascript"></script> .. which is a *178,827 byte* file. <sheesh>
>
Sheesh indeed.

I just tried changing that entry - and another it paused at - in my
hosts file from 127.0.0.1 to 127.0.0.0, as another has suggested; that
made no difference, it still spent a long time "connecting" to them.

FWIW, I do have Adblock Plus.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Can a blue man sing the whites?

Beauregard T. Shagnasty

Feb 7, 2013, 5:00:35 PM
J. P. Gilliver (John) wrote:

> I've just tried it again. ....

Sorry, I don't know what else to tell you... :-/

oh, other than remove the 'connect.facebook.net' from your hosts file and
see what happens.

VanguardLH

Feb 7, 2013, 5:12:37 PM
"Beauregard T. Shagnasty" wrote:

> »Q« wrote:
>
>> An easy (compared to installing a local server) workaround should be to
>> install AdBlock Plus* and use it to block all the sites listed in hosts.
>
> ..then I would have to install Adblock Plus in all my browsers. <g> No,
> I wouldn't bother doing that; I'll just use the hosts file.

If you're not using the huge pre-compiled 'hosts' files, like the one
from MVPS, and are instead adding your own entries, why not block the
domains (instead of a long list of hosts at each domain) up at your DNS
server? Instead of using your ISP's DNS server, use OpenDNS, create an
account there, install their DNS updater client (so they know the
current IP address reflected in your account and can enforce the DNS
config you define up there), and then start adding domains to the
blocklist in your OpenDNS account.

You can configure just your own host(s) to use OpenDNS. For your
network connectoids, in their properties, go to the properties of the
TCP/IP component. Instead of using DHCP to assign you a DNS server, use
a static DNS server. Add the ones for OpenDNS (at the top of the DNS
list or as the only DNS servers). Alternatively, you could configure your
router to use OpenDNS instead of it using DHCP to get the DNS server
from your ISP. Then leave all your hosts to use DHCP (from your
router). All DNS requests will end up going to OpenDNS. Your account
there has your current IP address (hence the need for their updater
client). They don't know who anyone is that connects to them except by
the client's IP address. So they have to match up the IP address you're
using now with the one recorded in your OpenDNS account so they can then
apply your account settings against your DNS requests.

With a free OpenDNS account, you can block up to 50 domains. It's not a
high count but remember that you are blocking on domains, not the
thousands of hosts on those domains. I can enter "doubleclick.com" and
block on all of the dozens of hostnames that the MVPS 'hosts' file would
list. You use your DNS service to block on domains.

Another choice is to get a firewall, anti-virus, or other security program
that includes URL blocking. If you are using a firewall, check if it
has URL blocking. For example, where you can specify
"*.doubleclick.com*" (with or without wildcards) to block on that string
in any URL. If your firewall doesn't have this, maybe it's time to
rethink which firewall you use. I don't bother with firewalls since, to
me, they're only useful beyond the one included in Windows for
regulating what processes on my host are permitted *outbound*
connections -- I don't need further protection against inbound
connections. I use Avast Free which has a URL Blocking list. I can
enter domains there. If I block on doubleclick.com then any page I
visit with URLs pointing there will get that content blocked. I can't even
go to the Doubleclick web site (where it would be 1st party content)
whether I use Avast's URL blocklist or my DNS provider's blocklist.

VanguardLH

Feb 7, 2013, 5:17:22 PM
"g" wrote:

> my problem is a bit different in that many of the sites i have in
> 'host' file do not get redirected to 127.0.0.0 or 127.0.0.1. they
> go straight to site i am trying to eliminate.

The 'hosts' file does not work on domains. It works on hosts. You must
specify the FQDN as an entry in the 'hosts' file. That's why, for
example, there isn't just one entry for doubleclick.com in the MVPS
'hosts' file but over 50 of them to list a long list of hosts that
DoubleClick uses now (but can change later). After Google's acquisition
of DoubleClick, and with the army of hosts used by Google, there are
even more entries in that pre-compiled 'hosts' file from MVPS. In fact,
their nameserver could accept ANY hostname to redirect the inbound
connect to their web server farm, so no number of entries in the 'hosts'
file for that domain will ever cover every possible hostname they may
use in the web pages at other sites that you visit. They don't even
have to specify a host.

doubleclick.com won't work to block on ads-serv.doubleclick.com.
They're not the same host. The first one doesn't even specify a host.
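
[A toy sketch of why this is (mine, not from the post): the resolver's
hosts lookup is an exact match on the full queried name, with no
wildcard or suffix matching:]

```python
# A toy model of how the resolver consults 'hosts': an exact-match
# table keyed on the full queried name. (Illustrative only.)
HOSTS = {
    "localhost": "127.0.0.1",
    "doubleclick.com": "127.0.0.1",   # bare domain entry
}

def hosts_lookup(name):
    # No wildcarding, no suffix matching -- hit or miss on the exact name.
    return HOSTS.get(name.lower())

print(hosts_lookup("doubleclick.com"))           # matched: 127.0.0.1
print(hosts_lookup("ads-serv.doubleclick.com"))  # no match: None, falls through to DNS
```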

Without knowing which entry in the 'hosts' file you believe applies to
the block, and which page you are rendering (to see where the 3rd-party,
off-domain content is sourced), there's no way to check for a mismatch.

Although you might know the content in a page was sourced from some
off-domain location, it won't be off-domain if the site tunnels the
content through that site. That is, that off-site content looks like it
came from on-site. Many sites also dynamically generate pages using
server-side scripts so all of the page content looks to have come from
the site that you visit (which obviously you wouldn't be blocking in the
'hosts' file).

In the example site you noted before at:

http://thechive.com/2012/04/16/awesome-dogs-are-awesome-30-photos/a-awesome-dog-5/

you said the web browser was waiting on connect.facebook.net. You said
that hostname was in your 'hosts' file. Okay but that's a *script*
source, not a content source as in text, images, HTML, etc. That page
also gets content from www.facebook.com. Do you have THAT host listed
in your 'hosts' file?

It wasn't the script you really wanted to block (although any content
delivered by it you might want blocked). It's the content, and that's
coming from a different host than the script.

How many host entries do you have in your 'hosts' file for the
facebook.com domain? I just looked at the MVPS 'hosts' file and they
don't even have entries for the connect and www hostnames at
facebook.com so you're using someone else's pre-compiled 'hosts' file or
compiling your own. In either case, you'll need to add the www host at
facebook.com if you want to block content from there on the thechive
page that you visited.

Where is the 'hosts' file physically located in the file system?

After modifying the 'hosts' file, have you:
- Run "ipconfig /flushdns"?
- Looked at its file attributes (attrib hosts)? It should only have the A
  (archive) attribute set on it (if it's set). Hidden (H), System (S),
  and Read-Only (R) should not be set.
- Purged your web browser's cache yet?
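
[As a quick way to check what the resolver will actually see, a hosts
file can be parsed and dumped with a few lines of Python; this is my
own sketch, and the Windows path in the comment is just the usual
location, so adjust as needed:]

```python
from pathlib import Path

def parse_hosts(path):
    """Read a hosts file into {hostname: ip}, skipping comments and blanks."""
    mapping = {}
    for line in Path(path).read_text().splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line:
            continue
        ip, *names = line.split()
        for name in names:                    # a line may list several names
            mapping[name.lower()] = ip
    return mapping

# Usual Windows location; adjust for your system:
# entries = parse_hosts(r"C:\Windows\System32\drivers\etc\hosts")
# print(entries.get("connect.facebook.net"))
```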

J. P. Gilliver (John)

Feb 7, 2013, 5:16:29 PM
In message <9eidnb_-lJB7YY_M...@mozilla.org>, VanguardLH
<V...@nguard.LH> writes:
>"J. P. Gilliver (John)" wrote:
>
>> I have some sites (including facebook and twitter) blocked in my hosts
>> file.
[]
>> Any suggestions how to avoid these delays? (At worst, I presume I'd have
>> to set up a web server on my machine?)
>
>Don't use 127.0.0.1. That is localhost. Your host will actually try to
>connect to a listening port on that host. That means timing out. It
>should happen quickly but there is overhead. The idea was to point to
>somewhere that no one was listening. I don't know why localhost became
>the favorite. A better choice is 127.0.0.0. Can't have a server
>listening on that port, no timeout waiting for a listening process that
>isn't there. Your client won't even bother trying to connect to
>127.0.0.0. I actually timed this and found 127.0.0.0 gives a faster
>timeout than 127.0.0.1.

I tried it with, as you suggested, http://127.0.0.1 and 0; in both cases
Firefox came back almost immediately with some sort of can't message,
slightly more quickly with .0 than .1 .
>
>Of course, if you're using someone else's pre-compiled 'hosts' file,
>like the one from MVPS, your choice is to leave all their entries
>pointing at 127.0.0.1 and have your host timeout trying to make a

I'm not - it's one I add to myself from time to time. It only has about
60 (active) lines in it - mostly aliases of facebook and twitter, with
four advertising ones. I might take your suggestion of changing them all
to 127.0.0.0, but I have tried changing a couple that the test page
tries to connect to to that, and it made no noticeable difference - the
page still paused for ages while it tried.

>connection that won't happen or use an editor to change all 127.0.0.1 to
>127.0.0.0 -- except you still need the "127.0.0.1 localhost" entry (so
>change all and then change this one back). Plus later if you ever want
>to run your own web server on your host then all those 127.0.0.1 entries
>in the 'hosts' file would be connecting to your local web server.
>
>127.0.0.1 represent the local host. It is your host. The 'hosts' file
>was intended to point at hosts, not networks. 127.0.0.0 represents a

Yes, I know that the original intention of hosts was to act as a sort of
local address book to save you having to call "directory enquiries" (UK
term - I think US uses a different term) alias a DNS, for the hosts
contained in it.

>network, not a host, but it can still be used in the 'hosts' file. You
>cannot connect to a network. You can only connect to a host. Your
>client will instantly fail if the local DNS lookup via 'hosts' returns

unfortunately, my client - Firefox - didn't _instantly_ fail.

>127.0.0.0 because it cannot connect to a network versus your client
>having to timeout at your local host's NIC interface to see there is no
>process listening there.

I fear something else is happening: code on the part of the page I am
loading is expecting something from the blocked site(s), I think. At
least, that's the explanation I can think of.
[]
>127.0.0.0 is used (for the network), the web browser fails instantly.

Mine doesn't )-:.
[]
>
>Try it yourself. Have your web browser go to "http://127.0.0.1", notice
>it status message, and how long before it errors. Then try to connect

Two or three seconds, I'd say.

>to "http://127.0.0.0" and notice the web browser IMMEDIATELY fails.

Near enough instantly, agreed.

>Well, if it's faster on just one blocked local DNS lookup, imagine how
>much faster a page would load if you did the same 127.0.0.0 return on
>the dozens if not hundreds of off-domain links in a web page or scripts.

Hang on ... just changing _all_ my 127.0.0.1s to .0s in hosts ...
(except the localhost one) ... done. Now to reload the test page ... 19
seconds "connecting" to www.google-analytics.com, then 44 seconds
connect.facebook.net, before anything other than the background appears.
>
>Personally I don't bother with using a 'hosts' file to block unwanted
>content. The MVPS 'hosts' file, for example, is something like 20K
>entries and searches through that file require an OPEN, READ, and CLOSE
>operations and the search is linear from top to bottom (it is a file,
>not a database). The size of these pre-compiled 'hosts' file have long
>exceeded the recommended max of 8K entries after which the lookup could
>take longer than the round-trip traffic for a DNS server request. The

So my 60-odd entries shouldn't ...

>list is of hosts and that means all those hosts at a domain. Just look
>at how many are listed for doubleclick.com. You cannot use 'hosts' to
>redirect a domain to localhost. You have to specify hostnames in there.

(You've lost me. Don't domains have hostnames? Or maybe vice versa?)
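The distinction being drawn is that the hosts file matches exact hostnames, never domain suffixes. A simplified sketch of that lookup (hypothetical parser for illustration, not the Windows resolver's actual code):

```python
def parse_hosts(text):
    """Build a hostname -> IP map from hosts-file text (comments/blanks skipped)."""
    table = {}
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if not line:
            continue
        fields = line.split()
        ip, names = fields[0], fields[1:]
        for name in names:
            table[name.lower()] = ip   # keyed by the full hostname, not the domain
    return table

HOSTS = parse_hosts("""\
127.0.0.1 localhost
127.0.0.0 www.facebook.com
127.0.0.0 connect.facebook.net
""")

def blocked(hostname):
    """True only when this exact hostname appears with a blocking entry."""
    return HOSTS.get(hostname.lower(), "").startswith("127.0.0.0")
```

An unlisted host at the same domain (say, api.facebook.com) still resolves normally, which is exactly why the pre-compiled lists carry dozens of separate doubleclick.com entries.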
[]
>blank 'hosts' file (just the "127.0.0.1 localhost" entry). 'hosts' is a
>rather clumsy means of blocking unwanted content.

But rather intellectually satisfying.
>
>By the way, there are many sites that don't merely have links to the
>social sites. They actually run scripts to gather statistics for those
>sites. That is, they're in league with them to aggregate stats on their
>visitors and how many visit the social sites for which they provide nav
>routes. Those scripts might be on- or off-domain of the site you visit
>but they are still running. You might want to look at add-ons that

That is what I _suspect_ is happening. I'm just a bit puzzled that the
status line says "connecting to" a hosts-blocked site for such a long
time.

>disable scripts (e.g., NoScript) and you pick which ones are allowed to
>run at a site, or an add-on that disables the hosted social site scripts

Wouldn't "hosts"ing those sites (even though you don't approve of doing
it that way) stop those scripts being fetched though?
[]
>Another choice is to use Adblock Plus, an add-on for Firefox. That
>will disable the URLs in scripts hosted on the site you visit that
>report to the social sites.

Installed.
[]
>Note that using Adblock will significantly increase the memory footprint
>for Firefox along with slowing the load of Firefox. Where Firefox

If it is doing, neither is bothering me.

>might've loaded and be ready in under 2 seconds will take 6 seconds with
>ABP enabled. The longer the blocklists the larger the firefox.exe

(I usually load it once per session and then leave it running, so load
time doesn't usually impinge.)
[]
>disable it to get a page working whereas with a 'hosts' file you'll have
>to unload the web browser, rename or delete the 'hosts' file, reload the
>web browser, and then revisit the web page to see if it now works
>without all the blocks.

Ah, I hadn't been unloading Firefox. Just restarted it (which was pretty
quick, certainly not enough to consider it a problem), and the dogs page
still shows connecting to connect.facebook.net for ages before anything
other than the graded background appears.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

J. P. Gilliver (John)

Feb 7, 2013, 5:21:27 PM
In message <ZZSdneCVjP2euYnM...@mozilla.org>, Beauregard T.
Shagnasty <a.non...@example.invalid> writes:
>J. P. Gilliver (John) wrote:
>
>> I've just tried it again. ....
>
>Sorry, I don't know what else to tell you... :-/
>
>oh, other than remove the 'connect.facebook.net' from your hosts file and
>see what happens.
>
Page loaded much more quickly )-:!
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

After I'm dead I'd rather have people ask why I have no monument than why I have
one. -Cato the Elder, statesman, soldier, and writer (234-149 BCE)

J. P. Gilliver (John)

Feb 7, 2013, 5:26:00 PM
In message <9--dnfzFgOFWu4nM...@mozilla.org>, VanguardLH
<V...@nguard.LH> writes:
[]
>If you not using the huge pre-compiled 'hosts' files, like the one from
>MVPS, and instead adding your own entry, why not block the domains
>(instead of a long list of hosts at each domain) up at your DNS server?
>Instead of using your ISP's DNS server, use OpenDNS, create an account
[]
It seems that I _am_ successfully blocking sites, and that is what is
slowing the page )-:! So what is the reason for that?
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

J. P. Gilliver (John)

Feb 7, 2013, 5:44:39 PM
In message <VZKdncvXNdZ6uonM...@mozilla.org>, VanguardLH
<V...@nguard.LH> writes:
[]
>In the example site you noted before at:
>
>http://thechive.com/2012/04/16/awesome-dogs-are-awesome-30-photos/a-awesome-dog-5/
>
>you said the web browser was waiting on connect.facebook.com. You said
>that hostname was in your 'hosts' file. Okay but that's a *script*
>source, not a content source as in text, images, HTML, etc. That page
>also gets content from www.facebook.com. Do you have THAT host listed
>in your 'hosts' file?

All I know is that the status line - in Firefox - says it is waiting for
that domain (if that's the right word), for an unconscionably long time,
if I have that domain (or rather name) in my hosts file, and doesn't if
I don't. So the blocking as such is, I think, working.
>
>It wasn't the script you really wanted to block (although any content
>delivered by it you might want blocked). It's the content and that's
>coming from a different host than for the script.

No, _I_ want to block my system from accessing that site at all; I want
to minimise (I realise I probably can't stop completely) that site
receiving indications of my existence, and certainly of my browsing
habits. I'm just puzzled why doing so - successfully, apparently -
causes long delays, with the browser apparently waiting on them.
>
>How many host entries do you have in your 'hosts' file for the
>facebook.com domain? I just looked at the MVPS 'hosts' file and they

About 50; mostly something.facebook.com, some something.fbcdn.net, and a
few further variations.

>don't even have entries for the connect and www hostnames at
>facebook.com so you're using someone else's pre-compiled 'hosts' file or
>compiling your own. In either case, you'll need to add the www host at

My own, but I do take suggestions I come across.

>facebook.com if you want to block content from there on the thearchive
>page that you visited.

I certainly have www.facebook.com among them.
>
>Where is the 'hosts' file physically located in the file system?

For some reason (XP here), it's C:\WINDOWS\system32\drivers\etc\hosts.
>
>After modifying the 'hosts' file, have you:
>- Run "ipconfig /flushdns"?
>- Looked at its file attributes (attrib hosts)? It should only have the A
> (archive) attribute set on it (if it's set). Hidden (H), System (S),
> and Read-Only (R) should not be set.
>- Purged your web browser's cache yet?

It's fairly clear that the blocking is working, without doing any of
those, since commenting out (putting a # before) one of the lines,
resaving the hosts file, and reloading the page in Firefox, changed how
the page loaded (it loaded without any of the long waits with the line
commented out).
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

VanguardLH

Feb 7, 2013, 5:49:32 PM
By the way, what do the facebook entries look like in your 'hosts' file?

I've found problems with parsing the 'hosts' file on different versions
of Windows if you put spacing in to make it more legible. In general:

- No whitespace before the "127.0.0.x" string in the line (or whatever
IP address you put there). The IP address must start in column 1 of the
line.

- Use either 1 tab or 1 space character after the IP address. Do not
use multiple whitespace characters (to make it look pretty to you).

- Do not attempt to put a comment within an entry line. That is, do not
add "# <comment>" after the entry. The # comment delimiter should
appear in column 1 and comments out the entire line.


Wrong:
   127.0.0.1 www.somesite.com           (leading spaces)
127.0.0.1     www.somesite.com          (multiple spaces between fields)
127.0.0.1		www.somesite.com          (2 tabs between fields)
127.0.0.1 www.somesite.com^^^           (^ = space; trailing spaces)
127.0.0.1 somesite.com (domain, not a *hostname*)

End-of-line is CR-LF, not just LF. Use Notepad to edit, not WordPad.
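Those formatting rules can be checked mechanically. A rough linter for the conventions listed above (these mirror the poster's advice for older Windows parsers, not a formal specification; `lint_hosts_line` is my own hypothetical helper):

```python
import re

# exactly one IP field, one single space or tab, one hostname, nothing else
ENTRY = re.compile(r"^\S+[ \t]\S+$")

def lint_hosts_line(line):
    """Return a list of problems in one hosts-file line (empty list = OK)."""
    problems = []
    body = line.rstrip("\r\n")
    if not body.strip() or body.lstrip().startswith("#"):
        return problems                  # blanks and whole-line comments pass
    if body[0] in " \t":
        problems.append("leading whitespace before the IP address")
    if body != body.rstrip():
        problems.append("trailing whitespace after the entry")
    if "#" in body:
        problems.append("inline comment (use a whole-line comment instead)")
    elif not ENTRY.match(body.strip()):
        problems.append("fields not separated by exactly one space or tab")
    return problems
```

Running each line of a hosts file through this would flag every "Wrong:" example above while passing `127.0.0.1<tab>localhost`.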

A huge 'hosts' file will slow Win7 local lookups (heavy load by svchost)
more than it did back on WinXP. You didn't mention your OS.

I'm wondering if a caching proxy couldn't be at fault (never used one so
am not familiar with how they cache off-domain content for a web page
that you visit and gets cached). You'll be re-retrieving prior content
from the proxy rather than re-rendering the web page to then block the
3rd party (off-domain) content. The proxy doesn't bother getting the
3rd party content again. Some ISPs use a caching proxy to speed up the
web experience to their customers. If it's your proxy, you would have
to do the ad blocking upstream of it (or as rules inside that caching
proxy). In fact, using your own proxy (e.g., privoxy) lets you do
whatever blocking you want without having to bother with a common add-on
to multiple web browsers (and the add-on may not be available across all
your webcentric apps). Blocking upstream of all your apps (even better
if upstream of all your hosts) eliminates having to use multiple
instances of app-specific software to do the blocking.

J. P. Gilliver (John)

Feb 7, 2013, 6:19:20 PM
In message <3fOdnewbhI_2sonM...@mozilla.org>, VanguardLH
<V...@nguard.LH> writes:
>By the way, what do the facebook entries look like in your 'hosts' file?
>
>I've found problems with parsing the 'hosts' file on different versions
>of Windows if you put spacing in to make it more legible. In general:
[]
The hosts file is working: when I put a # in front of the line, it
changes how the test page loads.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Microsoft announced recently that it had lost the source code to MS-DOS (Peter
Jackson, PC Magazine December 1998 [p. 29])

VanguardLH

Feb 7, 2013, 6:26:52 PM
"J. P. Gilliver (John)" wrote:

> VanguardLH writes:
>
>> After modifying the 'hosts' file, have you:
>> - Run "ipconfig /flushdns"?
>> - Looked at its file attributes (attrib hosts)? It should only have the A
>> (archive) attribute set on it (if it's set). Hidden (H), System (S),
>> and Read-Only (R) should not be set.
>> - Purged your web browser's cache yet?
>
> It's fairly clear that the blocking is working, without doing any of
> those, since commenting out (putting a # before) one of the lines,
> resaving the hosts file, and reloading the page in Firefox, changed how
> the page loaded (it loaded without any of the long waits with the line
> commented out).

The method of blocking you are talking about demands a [local or server]
DNS lookup. If the IP address is already known then there is no DNS
lookup. You aren't blocking on IP addresses. You are blocking on
hostnames. Humans like hostnames. Computers only use IP addresses.

If you've retrieved content from a source and its IP address is known,
that IP address gets reused from the DNS cache instead of doing yet
another DNS lookup. It is much faster to do a local lookup from a cache
than to requery the server (whether it be a DNS server or sequential
reads through the 'hosts' file). That's why I mentioned flushing both
the DNS cache and your web browser. The wrong attributes on the 'hosts'
file can also make it unreadable (won't be found).

You want to make sure that your client is using only a hostname and that
the IP address for it is not still available in the DNS cache or in your
client's cache.

Beauregard T. Shagnasty

Feb 7, 2013, 8:06:11 PM
J. P. Gilliver (John) wrote:

> Beauregard T. Shagnasty writes:
>>J. P. Gilliver (John) wrote:
>>> I've just tried it again. ....
>>
>>Sorry, I don't know what else to tell you... :-/
>>
>>oh, other than remove the 'connect.facebook.net' from your hosts file
>>and see what happens.
>>
> Page loaded much more quickly )-:!

Whoohoo! Problem solved ... except you'll garner that Facepuke script
every time you visit the page... :-(

Unfortunately, I am not allowed to block Facepuke on my computer. SWMBO
uses my computer and that site to communicate with daughter, play Scrabble
with her, etc.

I may have missed it; did you answer if other browsers hang up like Fx
when you have Facepuke in the hosts file?

J. P. Gilliver (John)

Feb 8, 2013, 4:43:43 PM
In message <luOdnUbYQq0e0onM...@mozilla.org>, Beauregard T.
Shagnasty <a.non...@example.invalid> writes:
>J. P. Gilliver (John) wrote:
>
>> Beauregard T. Shagnasty writes:
>>>J. P. Gilliver (John) wrote:
>>>> I've just tried it again. ....
>>>
>>>Sorry, I don't know what else to tell you... :-/
>>>
>>>oh, other than remove the 'connect.facebook.net' from your hosts file
>>>and see what happens.
>>>
>> Page loaded much more quickly )-:!
>
>Whoohoo! Problem solved ... except you'll garner that Facepuke script
>every time you visit the page... :-(

Yes, but the wrong problem has been solved. I want to have nothing to do
with them; I am still wondering why doing so takes so long. What is
waiting for what?
>
>Unfortunately, I am not allowed to block Facepuke on my computer. SWMBO
>uses my computer and that site to communicate with daughter, play Scrabble
>with her, etc.
>
>I may have missed it; did you answer if other browsers hang up like Fx
>when you have Facepuke in the hosts file?
>
No, I didn't, as I don't use any other browser.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

"Mr. Spock succumbs to a powerful mating urge and nearly kills Captain Kirk."
- TV Guide description of Amok Time Trek episode.

J. P. Gilliver (John)

Feb 8, 2013, 4:52:41 PM
In message <ZZadnYmWJK65pYnM...@mozilla.org>, VanguardLH
<V...@nguard.LH> writes:
>"J. P. Gilliver (John)" wrote:
>
>> VanguardLH writes:
>>
>>> After modifying the 'hosts' file, have you:
>>> - Run "ipconfig /flushdns"?
>>> - Looked at its file attributes (attrib hosts)? It should only have the A
>>> (archive) attribute set on it (if it's set). Hidden (H), System (S),
>>> and Read-Only (R) should not be set.
>>> - Purged your web browser's cache yet?
>>
>> It's fairly clear that the blocking is working, without doing any of
>> those, since commenting out (putting a # before) one of the lines,
>> resaving the hosts file, and reloading the page in Firefox, changed how
>> the page loaded (it loaded without any of the long waits with the line
>> commented out).
>
>The method of blocking you are talking about demands a [local or server]
>DNS lookup. If the IP address is already known then there is no DNS
>lookup. You aren't blocking on IP addresses. You are blocking on
>hostnames. Humans like hostnames. Computers only use IP addresses.

The pages that call facebook etc. do so by name, not number (probably
because they are written by humans).
>
>If you've retrieved content from a source and its IP address is known,
>that IP address gets reused from the DNS cache instead of doing yet
>another DNS lookup. It is much faster to do a local lookup from a cache
>than to requery the server (whether it be a DNS server or sequential
>reads through the 'hosts' file). That's why I mentioned flushing both
>the DNS cache and your web browser. The wrong attributes on the 'hosts'
>file can also make it unreadable (won't be found).
>
>You want to make sure that your client is using only a hostname and that
>the IP address for it is not still available in the DNS cache or in your
>client's cache.

No, I want to not access (for example) facebook at all, but preferably
without something waiting ages while it (fails to) do so. Whether this
is implemented by name, IP address, or a prayer to St. Jude is of no
interest to me.

Changing the hosts file - i. e. commenting out the line that translates
the unwanted name into 127.0.0.0 - changed the way the test page loaded,
_immediately_, as did changing it back: I changed the hosts file, then
clicked the reload button in the browser. With the hosts file commented
out, i. e. the remote site access allowed, the page loaded more or less
immediately; with it redirected to local, it paused for a minute or so.
I didn't need to flush any cache to witness the change in behaviour - in
both directions. (I think the "reload" button bypasses some caches.)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Beauregard T. Shagnasty

Feb 8, 2013, 5:29:58 PM
J. P. Gilliver (John) wrote:

> Beauregard T. Shagnasty writes:
>>I may have missed it; did you answer if other browsers hang up like Fx
>>when you have Facepuke in the hosts file?
>>
> No, I didn't, as I don't use any other browser.

Well, you could at least test with one. Gather more evidence. If other
browsers do the same slow load, it's not a Firefox problem. If they are
fast and unaffected by the hosts entry, then it may be Firefox.

Since you're the only person in the thread with the problem, it's gonna be
up to you. :-)

J. P. Gilliver (John)

Feb 8, 2013, 6:13:04 PM
In message <b8SdnVdai6_74YjM...@mozilla.org>, Beauregard T.
An interesting hypothesis; _is_ no-one else experiencing this?
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

"Address the chair!" "There isn't a chair, there's only a rock!" "Well, call it
a chair!" "Why not call it a rock?" (First series, fit the sixth.)

VanguardLH

Feb 8, 2013, 11:13:29 PM
I haven't perused all the other subthreads to check but ...

Have you yet disabled all your security software (anti-virus, firewall,
HIPS, whatever) and retested?

Christian Riechers

Feb 9, 2013, 2:09:28 AM
On 02/09/2013 12:13 AM, J. P. Gilliver (John) wrote:
> In message <b8SdnVdai6_74YjM...@mozilla.org>, Beauregard T.
> Shagnasty <a.non...@example.invalid> writes:
>> J. P. Gilliver (John) wrote:
>>
>>> Beauregard T. Shagnasty writes:
>>>> I may have missed it; did you answer if other browsers hang up like Fx
>>>> when you have Facepuke in the hosts file?
>>>>
>>> No, I didn't, as I don't use any other browser.
>>
>> Well, you could at least test with one. Gather more evidence. If other
>> browsers do the same slow load, it's not a Firefox problem. If they are
>> fast and unaffected by the hosts entry, then it may be Firefox.
>>
>> Since you're the only person in the thread with the problem, it's
>> gonna be
>> up to you. :-)
>>
> An interesting hypothesis; _is_ no-one else experiencing this?

The site loads instantly here. I do use Ghostery to block web bugs.
There are 19 blocked on this site, Facebook Connect being only one of
them. What a mess.

--
Christian

Christoph Schmees

Feb 9, 2013, 6:24:10 AM
On 08.02.2013 22:52, J. P. Gilliver (John) wrote:
> ...
> No, I want to not access (for example) facebook at all, but
> preferably without something waiting ages while it (fails to) do
> so. Whether this is implemented by name, IP address, or a prayer
> to St. Jude is of no interest to me.
> ...

+1

I for one use NoScript; in there I have denoted f.c.book as *not
trustworthy* - that kills all tracking scripts. Furthermore I
don't allow cookies from f.c.book. I employ RequestPolicy and
don't allow f.c.book there either. That's enough for me.

David H. Lipman

Feb 9, 2013, 9:22:59 AM
From: "J. P. Gilliver (John)" <G6...@soft255.demon.co.uk>

> I have some sites (including facebook and twitter) blocked in my hosts
> file.
>
> Some web pages have long pauses, with "waiting for" (or something
> similar - might be "waiting for response from") one of the blocked sites
> showing in the status line.
>
> Why is this? I thought putting something in the hosts file redirected
> things to the local computer, so there should be no delay. So presumably
> they're looking for some sort of script reply, or something?
>
> Any suggestions how to avoid these delays? (At worst, I presume I'd have
> to set up a web server on my machine?)


http://forums.malwarebytes.org/index.php?showtopic=122350

In short, disabling the "DNS Client" service in a non-AD environment can
mitigate the slowness introduced by implementing a large etc/hosts file.


--
Dave
Multi-AV Scanning Tool - http://multi-av.thespykiller.co.uk
http://www.pctipp.ch/downloads/dl/35905.asp

J. P. Gilliver (John)

Feb 9, 2013, 6:49:11 PM
In message <V7ydnUPBb_o3UYjM...@mozilla.org>, VanguardLH
No; when looking at sites with lots of script etc. on, I have no desire
to turn off such things.

I don't think that would explain why the page loads quickly when I _do_
allow access to the blocked site, either. (Surely if my security
software is slowing things when scripts etc. _can't_ be loaded, it would
make things run even more slowly when they can.)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

`Where a calculator on the Eniac is equipped with 18,000 vacuum tubes and weighs
30 tons, computers in the future may have only 1,000 vacuum tubes and perhaps
weigh 1.5 tons.' Popular Mechanics, March 1949 (quoted in Computing 1999-12-16)

J. P. Gilliver (John)

Feb 9, 2013, 6:50:00 PM
In message <h9ednc8519MoxovM...@mozilla.org>, David H.
Lipman <DLipman~nospam~@Verizon.Net> writes:
>From: "J. P. Gilliver (John)" <G6...@soft255.demon.co.uk>
>
>> I have some sites (including facebook and twitter) blocked in my
>>hosts file.
>>
>> Some web pages have long pauses, with "waiting for" (or something
>>similar - might be "waiting for response from") one of the blocked
>>sites showing in the status line.
>>
>> Why is this? I thought putting something in the hosts file redirected
>>things to the local computer, so there should be no delay. So
>>presumably they're looking for some sort of script reply, or something?
>>
>> Any suggestions how to avoid these delays? (At worst, I presume I'd
>>have to set up a web server on my machine?)
>
>
>http://forums.malwarebytes.org/index.php?showtopic=122350
>
>In short, Disabling the "DNS Client service" in a non-AD environment
>can mitigate the slowness introduced by implementing a large etc/hosts
>file.
>
>
Mine's only about 60 (active) lines.

VanguardLH

Feb 9, 2013, 7:37:01 PM
"J. P. Gilliver (John)" wrote:

> In message <V7ydnUPBb_o3UYjM...@mozilla.org>, VanguardLH
> <V...@nguard.LH> writes:
>>I haven't perused all the other subthreads to check but ...
>>
>>Have you yet disabled all your security software (anti-virus, firewall,
>>HIPS, whatever) and retested?
>
> No; when looking at sites with lots of script etc. on, I have no desire
> to turn off such things.
>
> I don't think that would explain why the page loads quickly when I _do_
> allow access to the blocked site, either. (Surely if my security
> software is slowing things when scripts etc. _can't_ be loaded, it would
> make things run even more slowly when they can.

If you're that paranoid about one single well-known site causing some
horrendous damage to your computer setup then do the testing inside a
virtual machine, use disk state virtualization (e.g., Returnil), or save
a snapshot or image backup of the OS partition before testing.

I didn't say to turn off your security software and leave it off. I
said to turn it off when testing this ONE site.

David H. Lipman

Feb 9, 2013, 9:38:33 PM
From: "J. P. Gilliver (John)" <G6...@soft255.demon.co.uk>

> In message <h9ednc8519MoxovM...@mozilla.org>, David H. Lipman
> <DLipman~nospam~@Verizon.Net> writes:
>> From: "J. P. Gilliver (John)" <G6...@soft255.demon.co.uk>
>>
>>> I have some sites (including facebook and twitter) blocked in my hosts
>>> file.
>>>
>>> Some web pages have long pauses, with "waiting for" (or something
>>> similar - might be "waiting for response from") one of the blocked sites
>>> showing in the status line.
>>>
>>> Why is this? I thought putting something in the hosts file redirected
>>> things to the local computer, so there should be no delay. So presumably
>>> they're looking for some sort of script reply, or something?
>>>
>>> Any suggestions how to avoid these delays? (At worst, I presume I'd have
>>> to set up a web server on my machine?)
>>
>> http://forums.malwarebytes.org/index.php?showtopic=122350
>>
>> In short, Disabling the "DNS Client service" in a non-AD environment can
>> mitigate the slowness introduced by implementing a large etc/hosts file.
>>
> Mine's only about 60 (active) lines.

OK. That's not really a large file.

Rob

Feb 10, 2013, 4:53:42 AM
J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
> In message <V7ydnUPBb_o3UYjM...@mozilla.org>, VanguardLH
> <V...@nguard.LH> writes:
>>I haven't perused all the other subthreads to check but ...
>>
>>Have you yet disabled all your security software (anti-virus, firewall,
>>HIPS, whatever) and retested?
>
> No; when looking at sites with lots of script etc. on, I have no desire
> to turn off such things.
>
> I don't think that would explain why the page loads quickly when I _do_
> allow access to the blocked site, either. (Surely if my security
> software is slowing things when scripts etc. _can't_ be loaded, it would
> make things run even more slowly when they can.)

I think it explains it very well.
You redirect many elements of the page you want to see to your local
computer. But you have software in place that somehow forbids those
connections, and thus the browser has no choice but to assume that these
connections respond slowly and wait a while.

The whole idea of putting local addresses in a hosts file relies on
the fact that the local computer does not have a webserver and the
connections to port 80 and 443 will immediately fail because a RST
is returned by the local computer when that connection is attempted.
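The RST behaviour described here is easy to observe: with nothing listening, the OS answers the SYN with a reset and the client fails at once (Python surfaces this as ConnectionRefusedError). A sketch, assuming no local server; `refused_quickly` and `unused_port` are my own helper names:

```python
import socket
import time

def refused_quickly(port):
    """Connect to 127.0.0.1:port; True if the OS refused it almost instantly."""
    start = time.monotonic()
    try:
        with socket.create_connection(("127.0.0.1", port), timeout=2.0):
            return False                 # something was listening after all
    except ConnectionRefusedError:
        return (time.monotonic() - start) < 1.0
    except OSError:
        return False                     # dropped/filtered: the slow case

def unused_port():
    """Grab a port nothing is listening on (bind to port 0, note it, release)."""
    with socket.socket() as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]
```

If a local firewall silently drops loopback packets instead of rejecting them, the OSError branch (a timeout) is what fires instead, and the browser sits in "connecting to..." much as described in this thread.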

It looks like your system does not work that way, and you are not
prepared to change that. Then, hosts file mapping to 127.0.0.1 is
not for you.

I think it is better to use Adblock and Ghostery to achieve what you
want, if needed with custom rules.
I never liked that whole "hostsfile for blocking sites" idea anyway,
it is quite limited because it is hostname-based and can easily be too
broad or too narrow.

J. P. Gilliver (John)

Feb 10, 2013, 6:43:09 AM
In message <trCdnbG1GOQ4dovM...@mozilla.org>, VanguardLH
<V...@nguard.LH> writes:
>"J. P. Gilliver (John)" wrote:
>
>> In message <V7ydnUPBb_o3UYjM...@mozilla.org>, VanguardLH
>> <V...@nguard.LH> writes:
>>>I haven't perused all the other subthreads to check but ...
>>>
>>>Have you yet disabled all your security software (anti-virus, firewall,
>>>HIPS, whatever) and retested?
>>
>> No; when looking at sites with lots of script etc. on, I have no desire
>> to turn off such things.
>>
>> I don't think that would explain why the page loads quickly when I _do_
>> allow access to the blocked site, either. (Surely if my security
>> software is slowing things when scripts etc. _can't_ be loaded, it would
>> make things run even more slowly when they can.)
>
>If you're that paranoid about one single well-known site causing some
>horrendous damage to your computer setup then do the testing inside a
>virtual machine, use disk state virtualization (i.e., Returnil), or save
>a snapshot or image backup of the OS partition before testing.

No, I'm not paranoid about damage: I have certain preferences (you can
call them paranoid if you like) about tracking. [To do all the above
would take far more time than is spent waiting for sites with links to
blocked sites to respond!]
>
>I didn't say to turn off your security software and leave it off. I
>said to turn it off when testing this ONE site.

I don't need to. Without any changes to my security software, I found
that the test page loaded _faster_ if I commented out one of the lines
in my hosts file, i. e. allowed the test page to load and run a script
(I think it was from fb - you can see what it was further back in this
thread). I suspect the same would apply to other pages which sit there
for ages saying "connecting to" a site that is blocked in my hosts file.
What I meant by the question that started this thread is: what is doing
the waiting, and why?
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Vini, Vidi, Velcro (I came, I saw, I stuck around).

J. P. Gilliver (John)

Feb 10, 2013, 6:54:34 AM
In message <slrnkherh6...@xs8.xs4all.nl>, Rob
<nom...@example.com> writes:
>J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
[]
>> I don't think that would explain why the page loads quickly when I _do_
>> allow access to the blocked site, either. (Surely if my security
>> software is slowing things when scripts etc. _can't_ be loaded, it would
>> make things run even more slowly when they can.
>
>I think it explains it very well.
>You redirect many elements of the page you want to see to your local
>computer. But you have software in place that somehow forbids those
>connections, and thus the browser has no choice but to assume that these
>connections respond slowly and wait a while.
>
>The whole idea of putting local addresses in a hosts file relies on
>the fact that the local computer does not have a webserver and the
>connections to port 80 and 443 will immediately fail because a RST

Yes, immediately.

>is returned by the local computer when that connection is attempted.
>
>It looks like your system does not work that way, and you are not
>prepared to change that. Then, hosts file mapping to 127.0.0.1 is
>not for you.

I would be delighted to change things such that my system returned a
null response immediately when interrogated in that way, if that is what
you think is not happening.
>
>I think it is better to use Adblock and Ghostery to achieve what you
>want, if needed with custom rules.
>I never liked that whole "hostsfile for blocking sites" idea anyway,
>it is quite limited because it is hostname-based and can easily be too
>broad or too narrow.

I already use Adblock. And I would be interested in hearing your
suggestion for alternative ways to block sites.

Note that it is not the scripts as such that I want to block - well, I
do, but that's not the primary point: I don't want to be _accessing_
those sites at all. It is _tracking_ behaviour I am trying to control.
(And no, I don't visit lots of sites I don't want people knowing about;
it's just the _general_ tracking I object to. Call it paranoid if you
wish, but there are plenty of other people who share this view.) If you
have a suggestion of a way of not accessing sites (or, if you insist,
names) _other_ than use of the hosts file, I will be interested.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

Freedom of the press is limited to those who have one.

Rob

unread,
Feb 10, 2013, 7:40:42 AM2/10/13
to
J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
> In message <slrnkherh6...@xs8.xs4all.nl>, Rob
> <nom...@example.com> writes:
>>J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
> []
>>> I don't think that would explain why the page loads quickly when I _do_
>>> allow access to the blocked site, either. (Surely if my security
>>> software is slowing things when scripts etc. _can't_ be loaded, it would
>>> make things run even more slowly when they can.
>>
>>I think it explains it very well.
>>You redirect many elements of the page you want to see to your local
>>computer. But you have software in place that somehow forbids those
>>connections, and thus the browser has no choice but to assume that these
>>connections respond slowly and wait a while.
>>
>>The whole idea of putting local addresses in a hosts file relies on
>>the fact that the local computer does not have a webserver and the
>>connections to port 80 and 443 will immediately fail because a RST
>
> Yes, immediately.

I think you only tested port 80 and not 443 (https)

>>is returned by the local computer when that connection is attempted.
>>
>>It looks like your system does not work that way, and you are not
>>prepared to change that. Then, hosts file mapping to 127.0.0.1 is
>>not for you.
>
> I would be delighted to change things such that my system returned a
> null response immediately when interrogated in that way, if that is what
> you think is not happening.

You probably need to find out how your firewall works and what it
filters and what filtering actions are possible.

>>I think it is better to use Adblock and Ghostery to achieve what you
>>want, if needed with custom rules.
>>I never liked that whole "hostsfile for blocking sites" idea anyway,
>>it is quite limited because it is hostname-based and can easily be too
>>broad or too narrow.
>
> I already use Adblock. And I would be interested in hearing your
> suggestion for alternative ways to block sites.
>
> Note that it is not the scripts as such that I want to block - well, I
> do, but that's not the primary point: I don't want to be _accessing_
> those sites at all. It is _tracking_ behaviour I am trying to control.
> (And no, I don't visit lots of sites I don't want people knowing about;
> it's just the _general_ tracking I object to. Call it paranoid if you
> wish, but there are plenty of other people who share this view.) If you
> have a suggestion of a way of not accessing sites (or, if you insist,
> names) _other_ than use of the hosts file, I will be interested.

I think you should look at Ghostery.
It is focussing exactly on that topic. I use it as well (together
with Adblock+). It is also much easier to block/unblock all kinds of
trackers with a mouseclick, rather than by editing a system file after
you first examine page source to know what hostnames you need to block.

Beauregard T. Shagnasty

unread,
Feb 10, 2013, 8:04:16 AM2/10/13
to
Rob wrote:

> The whole idea of putting local addresses in a hosts file relies on the
> fact that the local computer does not have a webserver ...

My computer has a webserver and my browsers return immediately from trying
to access any hostname that happens to be in the hosts file.

Rob

unread,
Feb 10, 2013, 8:56:34 AM2/10/13
to
Beauregard T. Shagnasty <a.non...@example.invalid> wrote:
> Rob wrote:
>
>> The whole idea of putting local addresses in a hosts file relies on the
>> fact that the local computer does not have a webserver ...
>
> My computer has a webserver and my browsers return immediately from trying
> to access any hostname that happens to be in the hosts file.

When it has a real webserver that quickly returns an error that is OK too.
The problems start when the webserver is not a full function server but
some interface to broken software, or when there is a firewall that blocks
connections using something else than a RST reply.
(e.g. dropping all packets, sending ICMP unreachable, etc)

J. P. Gilliver (John)

unread,
Feb 10, 2013, 9:44:06 AM2/10/13
to
In message <slrnkhf9oi...@xs8.xs4all.nl>, Rob
What's puzzling me is how, when I block a site (name) using the hosts
file, _what_ is it that is waiting for a response? Surely the script, if
that's what it is, shouldn't be loaded in the first place?
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

"Grammar is there to help, not hinder."
-- Mark Wallace, APIHNA, 2nd December 2000 (quoted by John Flynn 2000-12-6)

J. P. Gilliver (John)

unread,
Feb 10, 2013, 9:59:29 AM2/10/13
to
In message <slrnkhf5aa...@xs8.xs4all.nl>, Rob
<nom...@example.com> writes:
>J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
>> In message <slrnkherh6...@xs8.xs4all.nl>, Rob
>> <nom...@example.com> writes:
>>>J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
>> []
>>>> I don't think that would explain why the page loads quickly when I _do_
>>>> allow access to the blocked site, either. (Surely if my security
>>>> software is slowing things when scripts etc. _can't_ be loaded, it would
>>>> make things run even more slowly when they can.
>>>
>>>I think it explains it very well.
>>>You redirect many elements of the page you want to see to your local
>>>computer. But you have software in place that somehow forbids those
>>>connections, and thus the browser has no choice but to assume that these
>>>connections respond slowly and wait a while.
>>>
>>>The whole idea of putting local addresses in a hosts file relies on
>>>the fact that the local computer does not have a webserver and the
>>>connections to port 80 and 443 will immediately fail because a RST
>>
>> Yes, immediately.
>
>I think you only tested port 80 and not 443 (https)

I have just tried https://127.0.0.1/ and .0/; the first came back (with
an error message) in about two seconds, the second near enough
immediately. Does that test 443?
>
>>>is returned by the local computer when that connection is attempted.
>>>
>>>It looks like your system does not work that way, and you are not
>>>prepared to change that. Then, hosts file mapping to 127.0.0.1 is
>>>not for you.
>>
>> I would be delighted to change things such that my system returned a
>> null response immediately when interrogated in that way, if that is what
>> you think is not happening.
>
>You probably need to find out how your firewall works and what it
>filters and what filtering actions are possible.

Each rule in it can have set, independently I think: Protocol (any, TCP,
UDP, both, ICMP, or other); Direction (in, out, or both); Port type
(any, single, range, or list); whether any application or only a
specified executable; Remote endpoint Address (any, single, mask, range,
custom group) and Port (any, single, range, or list); and time range (or
always). Finally it can be permit or deny for each such rule.

However, I still don't think the firewall is the reason for the delay,
because purely changing the line in the hosts file (commenting out the
blocking of one site) made the difference to how fast the page loaded;
no change was made to the firewall settings.
>
>>>I think it is better to use Adblock and Ghostery to achieve what you
>>>want, if needed with custom rules.
>>>I never liked that whole "hostsfile for blocking sites" idea anyway,
>>>it is quite limited because it is hostname-based and can easily be too
>>>broad or too narrow.
>>
>> I already use Adblock. And I would be interested in hearing your
>> suggestion for alternative ways to block sites.
>>
>> Note that it is not the scripts as such that I want to block - well, I
>> do, but that's not the primary point: I don't want to be _accessing_
>> those sites at all. It is _tracking_ behaviour I am trying to control.
>> (And no, I don't visit lots of sites I don't want people knowing about;
>> it's just the _general_ tracking I object to. Call it paranoid if you
>> wish, but there are plenty of other people who share this view.) If you
>> have a suggestion of a way of not accessing sites (or, if you insist,
>> names) _other_ than use of the hosts file, I will be interested.
>
>I think you should look at Ghostery.
>It is focussing exactly on that topic. I use it as well (together
>with Adblock+). It is also much easier to block/unblock all kinds of
>trackers with a mouseclick, rather than by editing a system file after
>you first examine page source to know what hostnames you need to block.

I think I might look at Ghostery. But I still don't see why what I am
doing makes pages hang for a long time: can you explain that?
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

> > Won't you come into the garden? I would like my roses to see you. -Richard

Rob

unread,
Feb 10, 2013, 1:32:06 PM2/10/13
to
J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
> In message <slrnkhf9oi...@xs8.xs4all.nl>, Rob
> <nom...@example.com> writes:
>>Beauregard T. Shagnasty <a.non...@example.invalid> wrote:
>>> Rob wrote:
>>>
>>>> The whole idea of putting local addresses in a hosts file relies on the
>>>> fact that the local computer does not have a webserver ...
>>>
>>> My computer has a webserver and my browsers return immediately from trying
>>> to access any hostname that happens to be in the hosts file.
>>
>>When it has a real webserver that quickly returns an error that is OK too.
>>The problems start when the webserver is not a full function server but
>>some interface to broken software, or when there is a firewall that blocks
>>connections using something else than a RST reply.
>>(e.g. dropping all packets, sending ICMP unreachable, etc)
>
> What's puzzling me is how, when I block a site (name) using the hosts
> file, _what_ is it that is waiting for a response? Surely the script, if
> that's what it is, shouldn't be loaded in the first place?

You don't block anything using the hosts file.
Using the hosts file you only redirect a certain host to your local
system. You shift the burden of providing the element from the real
host to your own system.

Rob

unread,
Feb 10, 2013, 1:36:08 PM2/10/13
to
J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
> In message <slrnkhf5aa...@xs8.xs4all.nl>, Rob
> <nom...@example.com> writes:
>>J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
>>> In message <slrnkherh6...@xs8.xs4all.nl>, Rob
>>> <nom...@example.com> writes:
>>>>J. P. Gilliver (John) <G6...@soft255.demon.co.uk> wrote:
>>> []
>>>>> I don't think that would explain why the page loads quickly when I _do_
>>>>> allow access to the blocked site, either. (Surely if my security
>>>>> software is slowing things when scripts etc. _can't_ be loaded, it would
>>>>> make things run even more slowly when they can.
>>>>
>>>>I think it explains it very well.
>>>>You redirect many elements of the page you want to see to your local
>>>>computer. But you have software in place that somehow forbids those
>>>>connections, and thus the browser has no choice but to assume that these
>>>>connections respond slowly and wait a while.
>>>>
>>>>The whole idea of putting local addresses in a hosts file relies on
>>>>the fact that the local computer does not have a webserver and the
>>>>connections to port 80 and 443 will immediately fail because a RST
>>>
>>> Yes, immediately.
>>
>>I think you only tested port 80 and not 443 (https)
>
> I have just tried https://127.0.0.1/ and .0/; the first came back (with
> an error message) in about two seconds, the second near enough
> immediately. Does that test 443?

Yes. But apparently it is not the same thing.

> However, I still don't think the firewall is the reason for the delay,
> because purely changing the line in the hosts file (commenting out the
> blocking of one site) made the difference to how fast the page loaded;
> no change was made to the firewall settings.

Hosts file entries are not blocking anything.

>>I think you should look at Ghostery.
>>It is focussing exactly on that topic. I use it as well (together
>>with Adblock+). It is also much easier to block/unblock all kinds of
>>trackers with a mouseclick, rather than by editing a system file after
>>you first examine page source to know what hostnames you need to block.
>
> I think I might look at Ghostery. But I still don't see why what I am
> doing makes pages hang for a long time: can you explain that?

You are redirecting hosts to 127.0.0.1 and apparently that does not
reply immediately. Just stop doing that and the problem will disappear.

Beauregard T. Shagnasty

unread,
Feb 10, 2013, 2:15:22 PM2/10/13
to
Rob wrote:

> You are redirecting hosts to 127.0.0.1 and apparently that does not
> reply immediately. Just stop doing that and the problem will disappear.

Yes, but the ads and/or malicious sites will *re*appear.

We know what the hosts file does and how it works...

VanguardLH

unread,
Feb 10, 2013, 4:32:11 PM2/10/13
to
Wouldn't a [software] firewall that stealths a host NOT reply at all?
If it replied, then a hacker would know they had found a host. You don't
want the hacker to ever know they reached the host, and that means not
responding. So wouldn't the hacker/scanner have to wait at an IP under
test to see if something eventually was listening on a port there, and
after some timeout give up?

If so, then it definitely seems security software is stealthing the host,
like a firewall, causing a noticeable timeout delay rather than an
immediate error. Going to 127.0.0.1 means trying to connect to a host
(which took 2 seconds for JP's setup to time out), whereas 127.0.0.0
designates a network, and a client cannot connect to a network (JP said
that errored instantly).

I'm also wondering what the rest of the Javascript in the problematic
page(s) looks like. Blocking a script from some domain simply means it
won't get retrieved from there. That does NOT block or comment out all
the code that expects to use variables or functions that would've been
defined in the externally-referenced .js file. The code calling such a
function doesn't know the function isn't defined, so it probably causes
a delay before erroring (the web browser should be reporting all sorts
of "undefined" errors).

If the coder were smart, they would retrieve the external .js file and
then test whether something it defines, like a variable, actually showed
up in the document. I don't see many pages doing that test (that they
got the .js file). I don't think the <script src=...> tag returns a
status that can be tested, but there is probably a means of checking
whether the .js file existed or got retrieved, other than defining
file-specific variables and seeing whether they show up in the document.
The way pages are typically coded, they just assume the external script
will get retrieved okay, so they go ahead and try to use variables or
functions that would've been defined by that retrieved .js file. The
same goes for CSS files, since I don't see any testing that they were
actually available or got retrieved.

VanguardLH

unread,
Feb 10, 2013, 4:46:36 PM2/10/13
to
"J. P. Gilliver (John)" wrote:

> What's puzzling me is how, when I block a site (name) using the hosts
> file, _what_ is it that is waiting for a response? Surely the script, if
> that's what it is, shouldn't be loaded in the first place?

See my other reply (to Rob). The blocked .js file doesn't get
retrieved. It only defines the values of variables and the code within
functions. Those get used elsewhere in the document (web page). You're
*including* the .js file in the document; you aren't running it by
itself. Other script code in the document is going to use the variables
and functions deposited by retrieving the external .js file. Unless the
developer tests that the .js got retrieved (and, if not, skips other
script code in the document), all that script still gets executed -- and
errors.

http://www.google.com/search?q=javascript+test+if+.js+retrieved+included

There are many tricks to test if a .js or .css file got retrieved (to
include it in the current document). That doesn't mean the page(s) you
are loading do this. If not, they go off executing code that keeps
erroring (undefined var or function).

VanguardLH

unread,
Feb 10, 2013, 6:36:51 PM2/10/13
to
By the way, both NoScript and Ghostery provide surrogate scripts to
replace some of those it blocks. This may eliminate the delay for the
script code to error because the .js file was blocked from being
retrieved and included into the document.

http://purplebox.ghostery.com/?p=1016021974
http://purplebox.ghostery.com/?p=1016022253

I checked for NoScript, too. I looked in about:config and under the
following settings:

noscript.surrogate.ga.*

there were definitions to replace the Google Analytics script. So
either NoScript or Ghostery (or both as I have now) will try to
eliminate the delays in erroring code because the external .js file (to
include the vars and functions) got blocked.

I was using the Adblock Plus add-on for a few weeks. I grew weary of it
slowing the load of Firefox which is a known and often reported problem.
Below are some numbers with only NoScript installed, NoScript + Adblock
(no lists), NoScript + Adblock (2 lists), and NoScript + Ghostery:

Adblock Plus and Ghostery NOT installed:
firefox.exe memory = 77 MB
time to load Firefox = 2 sec

Adblock Plus installed (no subscribed blocklists):
firefox.exe memory = 120 MB
time to load Firefox = 6 sec

Adblock Plus installed (EasyList+EasyPrivacy & Malware Domains
blocklists):
firefox.exe memory = 148 MB
time to load Firefox = 6 sec

Ghostery installed:
firefox.exe memory = 110 MB
time to load Firefox = 2 sec

Just installing Adblock Plus with no subscriptions results in slowing
the load of Firefox. Subscribing to blocklists almost doubles the
memory footprint of Firefox. More blocklists uses more memory. With
Ghostery, Firefox quickly snaps open and memory usage is smaller.

There have been some users thinking the bigger -- MUCH bigger --
blocklists for Adblock Plus mean it will be far more effective than the
smaller list used by Ghostery. See:

http://cyberlaw.stanford.edu/node/6730

Yes, with the right combo of blocklists in Adblock Plus, it has the most
coverage (best blocking) -- but look at how little more it has over the
far smaller list used by Ghostery. Considering the irritating slowdown
to load Firefox and the jump in memory which increases with the addition
of blocklists, Adblock Plus was just too much a nuisance and too
resource hungry. I switched to Ghostery.

I don't like that Ghostery pops up a purple-background dialog showing
what it blocked. I can see that *if* I choose, by clicking on its status
bar icon. Most times I want it operating transparently. So if you get
Ghostery, go into about:config and change the following setting:

extensions.ghostery.showBubble
true = show the popup
false = do not show the popup

If you want the popup but want to shorten how long it stays onscreen:

extensions.ghostery.bubbleTimeout
15 (seconds) is the default

The popup will disappear before the timeout if you click on it. I
didn't want to bother clicking on it, and I didn't want it in the way.
The same info can be had by clicking on its status bar icon.

J. P. Gilliver (John)

unread,
Feb 11, 2013, 12:17:11 AM2/11/13
to
In message <XI6dnVc6IPLPiYXM...@mozilla.org>, VanguardLH
<V...@nguard.LH> writes:
>"J. P. Gilliver (John)" wrote:
>
>> What's puzzling me is how, when I block a site (name) using the hosts

[For Rob's benefit: I am perfectly aware of how the hosts file works,
and know I am not blocking that site; I _am_ blocking my machine from
_accessing_ that site (or if you insist, redirecting it to itself when
it tries, which in layman's terms _is_ blocking).]

>> file, _what_ is it that is waiting for a response? Surely the script, if
>> that's what it is, shouldn't be loaded in the first place?
>
>See my other reply (to Rob). The blocked .js file doesn't get
>retrieved. That only defines the values of variables and code within
>functions. Those get used elsewhere in the document (web page). You're
>*including* the .js file into the document. You aren't running it.
>Other scripts code in the document is going to use the variables and
>functions deposited upon retrieving the external .js file. Unless the

Yes, that must be what's happening.

>developer tests the .js got retrieved (and, if not, skips other script
>code in the document) then all that script still gets executed -- and
>errors.
>
>http://www.google.com/search?q=javascript+test+if+.js+retrieved+included
>
>There are many tricks to test if a .js or .css file got retrieved (to
>include it in the current document). That doesn't mean the page(s) you
>are loading do this. If not, they go off executing code that keeps
>erroring (undefined var or function).

I guess so. (Though usually they _do_ load eventually, and without any
obvious brokenness; they just pause for a long time before doing so.)

I'll have to look into this Ghostery thing.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

live your dash. ... On your tombstone, there's the date you're born and the
date you die - and in between there's a dash. - a friend quoted by Dustin
Hoffman in Radio Times, 5-11 January 2013

Rob

unread,
Feb 11, 2013, 3:45:57 AM2/11/13
to
Beauregard T. Shagnasty <a.non...@example.invalid> wrote:
> Rob wrote:
>
>> You are redirecting hosts to 127.0.0.1 and apparently that does not
>> reply immediately. Just stop doing that and the problem will disappear.
>
> Yes, but the ads and/or malicious sites will *re*appear.

Of course you install a different mechanism to achieve what you
want, e.g. Adblock+ and/or Ghostery.

> We know what the hosts file does and how it works...

Then you are able to debug any issues that occur.

Rob

unread,
Feb 11, 2013, 3:51:19 AM2/11/13
to
VanguardLH <V...@nguard.LH> wrote:
> "Rob" wrote:
>
>> Beauregard T. Shagnasty <a.non...@example.invalid> wrote:
>>> Rob wrote:
>>>
>>>> The whole idea of putting local addresses in a hosts file relies on the
>>>> fact that the local computer does not have a webserver ...
>>>
>>> My computer has a webserver and my browsers return immediately from trying
>>> to access any hostname that happens to be in the hosts file.
>>
>> When it has a real webserver that quickly returns an error that is OK too.
>> The problems start when the webserver is not a full function server but
>> some interface to broken software, or when there is a firewall that blocks
>> connections using something else than a RST reply.
>> (e.g. dropping all packets, sending ICMP unreachable, etc)
>
> Wouldn't a [software] firewall that stealths a host NOT reply at all?
> If it replied than a hacker would know they found a host. You don't
> want the hacker to ever know they reached the host and that means not
> responding. So wouldn't the hacker/scanner have to wait at an IP under
> test to see if something eventually was listening on a port there and
> after some timeout then give up?

Probably. I am not in that camp. I don't believe in that whole
stealthing thing. When my host sends a reply to a hacker, but filters
all the hacker's packets, there is nothing the hacker can do that he
could not do if the host was "stealth".

I send RST replies on some TCP SYNs and I think it is good. It avoids
problems similar to the topic of this discussion.

Beauregard T. Shagnasty

unread,
Feb 11, 2013, 7:32:26 AM2/11/13
to
Rob wrote:
Yes, but I am not the one with the issues.

VanguardLH

unread,
Feb 11, 2013, 2:25:36 PM2/11/13
to
"Rob" wrote:

> VanguardLH <V...@nguard.LH> wrote:
>> "Rob" wrote:
>>
>>> Beauregard T. Shagnasty <a.non...@example.invalid> wrote:
>>>> Rob wrote:
>>>>
>>>>> The whole idea of putting local addresses in a hosts file relies on the
>>>>> fact that the local computer does not have a webserver ...
>>>>
>>>> My computer has a webserver and my browsers return immediately from trying
>>>> to access any hostname that happens to be in the hosts file.
>>>
>>> When it has a real webserver that quickly returns an error that is OK too.
>>> The problems start when the webserver is not a full function server but
>>> some interface to broken software, or when there is a firewall that blocks
>>> connections using something else than a RST reply.
>>> (e.g. dropping all packets, sending ICMP unreachable, etc)
>>
>> Wouldn't a [software] firewall that stealths a host NOT reply at all?
>> If it replied than a hacker would know they found a host. You don't
>> want the hacker to ever know they reached the host and that means not
>> responding. So wouldn't the hacker/scanner have to wait at an IP under
>> test to see if something eventually was listening on a port there and
>> after some timeout then give up?
>
> Probably. I am not in that camp. I don't believe in that whole
> stealthing thing.

Perhaps you don't believe you need to hide your host because you've
never had a hacker use a scanner and then slam your host with connect
requests that literally block it from doing anything on the network.
Hope that works out for you.

Just because you don't believe in stealthing your host doesn't mean
whatever security software you have installed on your host has the same
opinion.

> When my host sends a reply to a hacker, but filters
> all the hacker's packets, there is nothing the hacker can do that he
> could not do if the host was "stealth".

Wrong. The hacker can drown your host's NIC interface with connection
requests. What do you think a DoS attack is? You're going to waste all
that processing time and network resources establishing connections only
to deny them?

> I send RST replies on some TCP SYNs and I think it is good. It avoids
> problems similar to the topic of this discussion.

And you'll keep sending, resending, and resending, and resending.

Again, that's your choice. But is it the choice of your security
software? You might be looking at just your firewall for its
configuration regarding stealthing but some anti-virus products also
have some firewalling features. So far, and from your replies, you
haven't identified that you have ANY security software on your host or
that you checked if it is stealthing or not.

VanguardLH

unread,
Feb 11, 2013, 2:26:58 PM2/11/13
to
Oops, thought it was JP to whom I was responding. That last bit was
really targeting JP, since he hasn't said (because he doesn't declare he
checked or knows) how his security software, if any, is configured.

Rob

unread,
Feb 11, 2013, 3:00:36 PM2/11/13
to
VanguardLH <V...@nguard.LH> wrote:
>> Probably. I am not in that camp. I don't believe in that whole
>> stealthing thing.
>
> Perhaps you don't believe you need to hide your host because you've need
> had a hacker using a scanner and then slam your host with connect
> requests that literally blocks it from doing anything on the network.
> Hope that works out for you.
>
> Just because you don't believe in stealthing your host doesn't mean
> whatever security software you have installed on your host has the same
> opinion.

There is no relation between "hide your host" and "use stealth" and
an attacker's ability to send connect requests to you.

>> When my host sends a reply to a hacker, but filters
>> all the hacker's packets, there is nothing the hacker can do that he
>> could not do if the host was "stealth".
>
> Wrong. The hacker can drown your host's NIC interface with connection
> requests. What do you think a DOS attack is? You're going to waste all
> that processing time and network resources to establish connections only
> to deny them?

They can do that anyway. There is nothing you can locally do to
prevent that.

This whole stealth movement was motivated by a single instance of a
network virus that first sent a ping and then proceeded to do further
attacks only when that ping was replied to. That is long ago and things
have changed a lot since then.
I know there is a strong-voiced man on the internet who has a different
opinion, but I think he is wrong.

>> I send RST replies on some TCP SYNs and I think it is good. It avoids
>> problems similar to the topic of this discussion.
>
> And you'll keep sending, resending, and resending, and resending.

No, there is only a single RST reply to each incoming SYN, and when I
like I can limit the rate even more.

> Again, that's your choice. But is it the choice of your security
> software? You might be looking at just your firewall for its
> configuration regarding stealthing but some anti-virus products also
> have some firewalling features. So far, and from your replies, you
> haven't identified that you have ANY security software on your host or
> that you checked if it is stealthing or not.

I am running Linux and I have no such intermixed software product on
my system. My firewall is what it is: a firewall. My virus scanner
is only used to identify possible malware, mostly targeting other systems.

J. P. Gilliver (John)

unread,
Feb 11, 2013, 5:08:52 PM2/11/13
to
In message <slrnkhijf4...@xs8.xs4all.nl>, Rob
<nom...@example.com> writes:
[]
>There is no relation between "hide your host" amd "use stealth" and
>an attackets possibility of sending connect requests to you.
[]
>They can do that anyway. There is nothing you can locally do to
>prevent that.
>
>This whole stealth movement has been motivated by a single instance
[]
>I am running Linux and I have no such intermixed software product on
>my system. My firewall is what it is: a firewall. My virus scanner
>is only used to identify possible malware, mostly targeting other systems.

My original query has got a bit lost. Understandably: I didn't
_originally_ explain _why_ I was using the hosts file, which was nothing
to do with blocking scripts and hacker attacks: I just wanted to stop
tracking, by stopping my machine trying to _access_ certain sites
(names). [Though blocking scripts was I thought a useful side-effect.]

The best explanation for the delays that I have seen in this thread so
far is that the code on the page containing the links to the sites I
don't want to visit is expecting some sort of return parameters from
code on those sites. I am still a little puzzled, though, as (a) the
pages usually do load eventually, without showing any sign of being
"broken" due to lack of such return parameters, and (b) no-one else has
admitted having these symptoms. Though (b) could be explained by no-one
else doing it the way I do, perhaps using Ghostery instead.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf

VanguardLH

unread,
Feb 11, 2013, 10:51:28 PM2/11/13
to
In Internet Explorer and for sites where I have blocked scripts coming
from off-domain sites (i.e., they're coming from elsewhere), the visited
page will load BUT it will incur errors which I see with the error icon
at the left end of the status bar at the bottom of IE's window, like the
following:

Message: 'content' is null or not an object
Message: Object doesn't support this property or method

The first error is due to some variable or object that was defined in
the blocked script. Since it got blocked, it didn't get included in the
code for the current page. Then something tries to use the variable or
object but it isn't defined. It was in the external file (for the
script) that got blocked. The second error is when code in the visited
page (included in the page and not retrieved from elsewhere to include
in that page) tries to call a function that was defined in the blocked
script file. Since the function never arrived to get included with the
rest of the page's code, the page's code errors because it can't find
the method (function). It's not there because the source for the
function's definition got blocked.
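The two failure modes described above can be sketched in a few lines. This is a minimal simulation, not code from any real page: `content` and `trackPageView` are hypothetical names standing in for whatever the blocked script would have defined. Modern engines report these as ReferenceError and TypeError; IE's wording ("is null or not an object", "Object doesn't support this property or method") differs but the cause is the same.

```javascript
// In-page code that assumes an external (now blocked) script already ran
// and defined `content` and `trackPageView`. Neither exists here.
function runPageCode() {
  const errors = [];
  try {
    // The blocked script would have defined `content`; it never arrived.
    content.items.forEach((item) => console.log(item));
  } catch (e) {
    errors.push(e.name); // ReferenceError: content is not defined
  }
  try {
    // The blocked script would have defined this function; calling a
    // missing method throws.
    window.trackPageView("/index.html");
  } catch (e) {
    errors.push(e.name); // TypeError: window.trackPageView is not a function
  }
  return errors;
}

// Simulate a browser-like global object for this sketch.
globalThis.window = globalThis;
console.log(runPageCode()); // [ 'ReferenceError', 'TypeError' ]
```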

Firefox doesn't show a handy, immediately available error status icon you
can click to see the errors that occurred while rendering the page or
running its scripts. Instead you have to open Firefox's error console
(Ctrl+Shift+J). There you'll see all the errors caused by blocking
active content (scripts): those scripts were needed to define the vars,
objects, or methods that the rest of the page's code expects to be
available.

In IE, the only blocking that gets done is via its InPrivate Filtering
and an XML file that I wrote and imported, which tells IE what content
to block. However, unlike other filtering programs or add-ons used by
other web browsers, IE does NOT block that content from the 1st-party
domain. That is, for example, if I block on *.doubleclick.com* then IE
will block content in a web page that comes from that domain as 3rd
party content. That does not preclude me from visiting doubleclick.com
where its pages are 1st party content. The idea is to block unwanted
content from ELSEWHERE than where I visit. I do also block some domains
or URL strings in my anti-virus program (Avast Free) which will prevent
any web-centric application from getting to those domains. That same
blacklist in my AV program (through its web content malware/filtering
option) will affect Firefox, too. In Firefox, in addition to the AV's
URL blacklist, I have both NoScript and Ghostery blocking some stuff. I
configured NoScript to allow scripts in the domain that I visit. Well,
I'm choosing to go there and I'll end up enabling their scripts
(probably whitelisting them) because I do want what they have to show.
I use the Javascript Switch add-on to disable all Javascript before
visiting an unknown or suspect site. So NoScript will, by default in my
configuration, block 3rd party scripts, like Google Analytics, unless I
allow them (temporarily or via whitelisting). Ghostery has a lot of
its own blocks. So in Firefox's error console I see many more errors
than IE shows, but I expect that to be the case. I'm doing a lot more
filtering in Firefox (now my primary web browser) than in IE (still
needed for Windows Update and as a second opinion web browser).

Because you haven't looked in Firefox's error console, you don't
realize that all those in-page scripts are generating all sorts of
errors trying to use variables, objects, or methods that would have been
defined if you had not blocked those script files from being retrieved.

The page's code will still execute despite all that blocking. Unless
the developer specifically tested whether the external script file got
retrieved (and typically they don't), the rest of the page's code simply
assumes the script file arrived, so all those variables, objects, and
methods would be available. The page code merrily proceeds
to execute but errors each time it tries to use something that should've
been included if the blocked script file had not been blocked.
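As a sketch of the kind of guard a page author *could* add so a blocked third-party script doesn't break the rest of the page (again, `trackPageView` is a hypothetical name, not a real library function):

```javascript
// Defensive wrapper: only call the tracker if its script actually
// loaded and defined the function; otherwise degrade gracefully
// instead of throwing a TypeError.
function safeTrack(path) {
  if (typeof globalThis.trackPageView === "function") {
    globalThis.trackPageView(path);
    return "tracked";
  }
  return "tracker blocked or not loaded";
}

console.log(safeTrack("/index.html")); // "tracker blocked or not loaded"
```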

You might be blocking to supposedly speed up the rendering of web pages.
That's doable if you are blocking *content* and not scripts. You might
be blocking scripts because you don't want to get tracked or you feel
some scripts exhibit "bad" behavior. That will result in the rest of
the in-page script generating errors. If there is a lot of in-page code
then there will be lots of errors.