
How to block Google Groups


default

Mar 15, 2008, 1:52:38 PM

This is the draconian solution and blocks everything from Google - I
plan to fine-tune it a bit as time allows. This eliminates all Google
groupers and about 90% of the spam.

Download and extract to a directory: newsproxy

Download link:
http://www.pricelesswarehome.org/ftp/Programs/nps-124.zip

This proxy program needs to be inserted between your news reader and
your Internet connection. To do that, change your newsreader's server
to "localhost" minus the quotes. Localhost is your own computer.

Run newsproxy.exe from the directory where you extracted it, and
configure it to connect to your news server (the same one you replaced
with localhost). If you have a firewall, newsproxy needs to be allowed
to connect to the net. (Read the readme files.)

When you first run the program it will look for nfilter.dat. There is
no nfilter.dat in the stand-alone version (the Windows-installed
version has the file, with examples of how to use it). When you create
your first filter it asks if you want to create the filter file - go
for it.

At this point, if everything is working, your newsreader should be
able to connect and download headers like it always did, and you
shouldn't have problems posting.

You need to add a filter to tell newsproxy that Google is off limits.
The syntax for the filter is: * drop Path:*google*

This will eliminate Google Groups from all newsgroups. It can be
tweaked to eliminate it from selected groups only, or to eliminate
just groupers who cross-post (read the FAQ for that).

That's pretty much it . . . no more google groups and 90% of the spam
evaporates too.

The filter itself is a regular text file (you can make it with
Notepad); call it nfilter.dat. The line * drop Path:*google*
needs to be in the file. Or you can configure newsproxy to read its
filters from a different file, so you can maintain several filter
files for different tasks.
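For readers curious how such a wildcard rule behaves, here is a minimal Python sketch of the matching logic as described above. The rule parsing and semantics here are assumptions from this post, not newsproxy's actual code, and group-pattern matching is omitted:

```python
from fnmatch import fnmatch

def matches_drop_rule(headers, rule):
    """Check a '<groups> drop <Header>:<pattern>' rule against one message.

    Group matching is omitted; only the header pattern is tested.
    """
    _groups, action, header_spec = rule.split(None, 2)
    assert action == "drop"
    name, _, pattern = header_spec.partition(":")
    # Shell-style wildcards, as in the '*google*' example above.
    return fnmatch(headers.get(name, ""), pattern)

headers = {"Path": "news.example.com!postnews.google.com!not-for-mail"}
print(matches_drop_rule(headers, "* drop Path:*google*"))  # True
```

Any message whose Path header mentions google matches and would be dropped; a message with no Path header at all matches nothing.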

The version of newsproxy that comes with a Windows installer is at:
http://www.nfilter.org/np-120.exe It comes with a dummy nfilter.dat
file pre-installed that gives hints on the correct syntax for filters.

http://www.nfilter.org/faq.html#2.1 is an FAQ on the operation and
setup. Some folks have also posted their own nfilter.dat files. You
can filter on virtually anything that appears in the full header for
newsgroup posts. The filter can be global or just for a specific
group. A filter can be installed to filter out cross posters, or
cross posts that exceed a given number.

The program may seem like it isn't doing anything, or you may see your
newsreader say it is downloading 20 new headers when only 9 new ones
appear. Choose View > Dropped Articles to see a list of the spam you
are missing in your reader, as well as which filter caught each article.

I love this thing, but am interested in hearing others' experiences and
perhaps some hints on better filtering techniques.
--


----== Posted via Pronews.Com - Unlimited-Unrestricted-Secure Usenet News==----
http://www.pronews.com The #1 Newsgroup Service in the World! >100,000 Newsgroups
---= - Total Privacy via Encryption =---

Rich Webb

Mar 15, 2008, 12:59:38 PM
On Sat, 15 Mar 2008 12:52:38 -0500, default <def...@defaulter.net>
wrote:
[snip...snip...]

>I love this thing, but am interested in hearing others experience and
>perhaps some hints on better filtering techniques.

I installed Hamster http://www.elbiah.de/hamster/pg/ for much the same
reason. Can't do an A-B comparison of the two since I've only used the
one but it does what I want in terms of blocking googlegroups plus
allowing selected posters through that service.

--
Rich Webb Norfolk, VA

default

Mar 15, 2008, 2:23:17 PM

I need to play with it some more . . . filtering out too much right
now. I go to the filter log and look at message ID's, then search for
the message using google groups search and find stuff I would rather
let through.

How did you find the setup for hamster to be? The language put me off
when I was researching for a proxy that would filter Google.

Martin Griffith

Mar 15, 2008, 1:21:13 PM

JeffM

Mar 15, 2008, 2:01:12 PM
Martin Griffith wrote:
>some info
>http://www.theregister.co.uk/2008/03/14/captcha_serfs/

Yup. That's a viable theory. Once you automate it, however,
the cost goes to a penny a day for the same results.

Whatever the *mechanism*,
Paul Hovnanian has given the only **solution** I have seen:
http://groups.google.com/group/sci.electronics.design/browse_frm/thread/98cece48cecfa906/767c229628fc90bd?q=What-Google-needs+*-*-*-*-*-*-*-refunds+zz-zz+*-CAPTCHA-*-*-*-*-*-*-*-*-cracked+advertisers
news:47DAC432...@seanet.com

Hint: It's a bottom-line issue with Google;
if you aren't paying Google for advertising, YOU don't matter.
(Has Google opened a branch office in Redmond?)
http://www.google.com/search?q=Don't-be-evil

Martin Griffith

Mar 15, 2008, 2:40:39 PM

OT can you find a worse site than this?
http://havenworks.com/


martin

JeffM

Mar 15, 2008, 3:07:39 PM
Martin Griffith wrote:
>OT can you find a worse site than this?
>http://havenworks.com/

Its code *is* a mess
http://validator.w3.org/check?uri=http://havenworks.com/
Failed validation, 2885 Errors

...but it's not even close to my record for non-compliance:
http://validator.w3.org/check?uri=http://www.oreilly.com/catalog/winxpnut2/inx.html
Failed validation, 7341 Errors

It does edge out what was in the #2 spot on my crap list:
http://validator.w3.org/check?uri=http://imdb.com/tvgrid/2006-11-14/2000
Failed validation, 2843 Errors

Martin Griffith

Mar 15, 2008, 3:49:50 PM
On Sat, 15 Mar 2008 12:07:39 -0700 (PDT), in sci.electronics.design
JeffM <jef...@email.com> wrote:

>Martin Griffith wrote:
>>OT can you find a worse site than this?
>>http://havenworks.com/
>

>...but it's not even close to my record for non-compliance:


and O'Reilly publishes books on......now let me see......

I occasionally wonder, usually after a visit to the validator, how on
earth it's possible to write a web browser that doesn't hang every
five minutes.


martin

John Fields

Mar 15, 2008, 3:53:27 PM
On Sat, 15 Mar 2008 19:40:39 +0100, Martin Griffith
<mart_in...@yah00.es> wrote:


>OT can you find a worse site than this?
>http://havenworks.com/

---
Probably not. ;)

This from near the end of it:

<QUOTE>
!-! Nominated for Most Poorly Designed Website in the World by
Digg.com
and "El peor diseño del mundo" by elMundo.es
--------------------------------------------------------------------------------
Website Web Design: Web Design by Hermit ;-)
--------------------------------------------------------------------------------
<END QUOTE>


--
JF

Martin Griffith

Mar 15, 2008, 4:13:21 PM

LOL, never got that far...


martin

Paul Hovnanian P.E.

Mar 15, 2008, 7:10:23 PM
JeffM wrote:
>
> Martin Griffith wrote:
> >some info
> >http://www.theregister.co.uk/2008/03/14/captcha_serfs/
>
> Yup. That's a viable theory. Once you automate it, however,
> the cost goes to a penny a day for the same results.
>
> Whatever the *mechanism*,
> Paul Hovnanian has given the only **solution** I have seen:
> http://groups.google.com/group/sci.electronics.design/browse_frm/thread/98cece48cecfa906/767c229628fc90bd?q=What-Google-needs+*-*-*-*-*-*-*-refunds+zz-zz+*-CAPTCHA-*-*-*-*-*-*-*-*-cracked+advertisers
> news:47DAC432...@seanet.com

You're going to have to be more careful, quoting that nut-case. But the
truth of the matter is: this solution might be the only way to stop open
Usenet services that hand out accounts to abusive users. It might not
just be a financial thing, but a question of reputation. Become known as
the home for riff-raff and your better users will abandon the service.
It becomes a self-fulfilling prophecy.


> Hint: It's a bottom-line issue with Google;
> if you aren't paying Google for advertising, YOU don't matter.
> (Has Google opened a branch office in Redmond?)
> http://www.google.com/search?q=Don't-be-evil

--
Paul Hovnanian mailto:Pa...@Hovnanian.com
------------------------------------------------------------------
Plaese porrf raed befre postng.

Jim Thompson

Mar 15, 2008, 6:18:09 PM
On Sat, 15 Mar 2008 12:52:38 -0500, default <def...@defaulter.net>
wrote:

[snip]


>
>You need to add a filter to tell newsproxy that Google is off limits.
>The syntax for the filter is: * drop Path:*google*
>

[snip]

What is the syntax for dropping Google, but with exceptions?

...Jim Thompson
--
| James E.Thompson, P.E. | mens |
| Analog Innovations, Inc. | et |
| Analog/Mixed-Signal ASIC's and Discrete Systems | manus |
| Phoenix, Arizona Voice:(480)460-2350 | |
| E-mail Address at Website Fax:(480)460-2142 | Brass Rat |
| http://www.analog-innovations.com | 1962 |

America: Land of the Free, Because of the Brave

JeffM

Mar 15, 2008, 7:57:25 PM
default wrote:
>>You need to add a filter to tell newsproxy that Google is off limits.
>>The syntax for the filter is: * drop Path:*google*
>>
Jim Thompson wrote:
>What is the syntax for dropping Google, but with exceptions?

(alt.binaries.schematics.electronic removed from Groups line.)

ISTM that you could get 99.99% of the current rash with a Boolean AND
of From==*gmail* AND Message-ID==*googlegroups*

Possible? Notable exceptions?
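JeffM's suggested Boolean AND is easy to express in code. A hypothetical Python sketch of the combined test (whether newsproxy itself can combine conditions this way is a separate question; see the FAQ):

```python
from fnmatch import fnmatch

def is_suspect(headers):
    """Drop only when BOTH patterns match, so gmail users posting
    through a real news server are left alone."""
    return (fnmatch(headers.get("From", ""), "*gmail*") and
            fnmatch(headers.get("Message-ID", ""), "*googlegroups*"))

print(is_suspect({"From": "spammer@gmail.com",
                  "Message-ID": "<x@googlegroups.com>"}))  # True
print(is_suspect({"From": "regular@gmail.com",
                  "Message-ID": "<y@individual.net>"}))    # False
```

The AND is what makes this narrower than a blanket @gmail plonk: a gmail address alone is not enough to trip it.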

Jim Thompson

Mar 15, 2008, 8:09:23 PM
On Sat, 15 Mar 2008 16:57:25 -0700 (PDT), JeffM <jef...@email.com>
wrote:

Hmmmmm :-) You may be right.

Presently I just blanket plonk @gmail or @hotmail.

I'll do away with my present Agent-contained filters, watch headers
and keep data ;-)


Rich Webb

Mar 15, 2008, 3:20:42 PM
On Sat, 15 Mar 2008 13:23:17 -0500, default <def...@defaulter.net>
wrote:

>
>On Sat, 15 Mar 2008 12:59:38 -0400, Rich Webb
><bbe...@mapson.nozirev.ten> wrote:
>
>>On Sat, 15 Mar 2008 12:52:38 -0500, default <def...@defaulter.net>
>>wrote:
>>[snip...snip...]
>>>I love this thing, but am interested in hearing others experience and
>>>perhaps some hints on better filtering techniques.
>>
>>I installed Hamster http://www.elbiah.de/hamster/pg/ for much the same
>>reason. Can't do an A-B comparison of the two since I've only used the
>>one but it does what I want in terms of blocking googlegroups plus
>>allowing selected posters through that service.
>
>I need to play with it some more . . . filtering out too much right
>now. I go to the filter log and look at message ID's, then search for
>the message using google groups search and find stuff I would rather
>let through.
>
>How did you find the setup for hamster to be? The language put me off
>when I was researching for a proxy that would filter Google.

Not too bad; got it all working over a lunch break.

Now that I've got a fair record in the killfile log, I'll probably use
a bit of Perl to parse it for additional IDs that I may want to
whitelist. On the whole, though, I've been pretty happy with it.

IanM

Mar 16, 2008, 8:33:26 AM

"default" <def...@defaulter.net> wrote in message
news:1205599143_135@isp.n...

Hi,

My newsserver (Astraweb) doesn't supply Path in the headers, so I had to use

* drop Message-ID:*google*

You can telnet to your newsserver and "ask" it what it supplies as header
info using the LIST OVERVIEW.FMT command.

list overview.fmt
215 Order of fields in overview database.
Subject:
From:
Date:
Message-ID:
References:
Bytes:
Lines:
Xref:full
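Since only the fields listed in that reply are available for filtering, a quick way to check is to parse the 215 response. A small Python sketch using the example reply above (a live check would of course need a real NNTP connection):

```python
def overview_fields(reply):
    """Parse a LIST OVERVIEW.FMT reply into bare header names."""
    lines = reply.strip().splitlines()
    assert lines[0].startswith("215")
    # Both a trailing ':' and ':full' reduce to the name before the colon.
    return [line.split(":")[0] for line in lines[1:]]

reply = """215 Order of fields in overview database.
Subject:
From:
Date:
Message-ID:
References:
Bytes:
Lines:
Xref:full"""

fields = overview_fields(reply)
print("Path" in fields)        # False: filter on Message-ID instead
print("Message-ID" in fields)  # True
```

For a server like this one, a Path-based filter would silently match nothing, which is exactly why the Message-ID variant is needed.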

--

IanM

default

Mar 16, 2008, 9:41:04 AM

On Sat, 15 Mar 2008 15:18:09 -0700, Jim Thompson
<To-Email-Use-Th...@My-Web-Site.com> wrote:

>What is the syntax for dropping Google, but with exceptions?

I don't think it allows exceptions; you add a filter for each poster,
or for whole domains or IP addresses.

Maybe Hamster can do that - my initial take on Hamster is that it has
a steeper learning curve.

IanM

Mar 16, 2008, 9:35:18 AM

"Jim Thompson" <To-Email-Use-Th...@My-Web-Site.com> wrote in
message news:lmiot35oc7hnmob3s...@4ax.com...


> On Sat, 15 Mar 2008 12:52:38 -0500, default <def...@defaulter.net>
> wrote:
>
> [snip]
>>
>>You need to add a filter to tell newsproxy that Google is off limits.
>>The syntax for the filter is: * drop Path:*google*
>>
> [snip]
>
> What is the syntax for dropping Google, but with exceptions?
>
> ...Jim Thompson

Jim,

is this the sort of thing you want?

* score:-100 From:*insertnamehere*
* score:+10 Message-ID:*google*
* FLAG:KILL-FILE score:10

see faq

http://www.nfilter.org/faq.html#3.8
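To make the scoring idea concrete: a large negative score whitelists a named poster, a positive score flags Google posts, and anything reaching the kill threshold is dropped. A hypothetical Python sketch of that logic (the rule values mirror the example above; this is an illustration, not nfilter's implementation):

```python
from fnmatch import fnmatch

# Hypothetical rules mirroring the nfilter example above.
RULES = [
    (-100, "From", "*insertnamehere*"),  # whitelist: big negative score
    (+10, "Message-ID", "*google*"),     # flag Google Groups posts
]
KILL = 10  # corresponds to FLAG:KILL-FILE score:10

def should_drop(headers):
    score = sum(pts for pts, name, pat in RULES
                if fnmatch(headers.get(name, ""), pat))
    return score >= KILL

print(should_drop({"Message-ID": "<x@googlegroups.com>"}))  # True
print(should_drop({"From": "insertnamehere@gmail.com",
                   "Message-ID": "<x@googlegroups.com>"}))  # False
```

The whitelisted poster's -100 swamps the +10 Google penalty, so they get through even when posting via Google Groups.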

--

IanM


Rich Webb

Mar 16, 2008, 10:49:19 AM
On Sun, 16 Mar 2008 08:41:04 -0500, default <def...@defaulter.net>
wrote:

>
>On Sat, 15 Mar 2008 15:18:09 -0700, Jim Thompson
><To-Email-Use-Th...@My-Web-Site.com> wrote:
>
>>What is the syntax for dropping Google, but with exceptions?
>
>I don't think it allows exceptions, you either add a filter for each
>poster or whole domains or IP addresses.
>
>Maybe Hamster can do that - my initial take on hamster is that it has
>a steeper learning curve.

The Hamster scoring scheme allows either a direct assignment or an
increment, so I can say "=-9999 Message-ID googlegroups.com" to assign
a value or "-100 Message-ID googlegroups.com" and "+100 From
<user.name.here>" as increments. At the end of the process, after all
rules have been applied, messages with a score >= 0 will be loaded.

The post-load fields are also accessible (after loading into the
Hamster server, natch) so the NNTP-Posting-Host and Injection-Info
header fields can also be checked and their scores added to the total,
which catches google-groupers impersonating others who may post
through that service.
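Rich Webb's description of the scoring pass can be sketched as follows; the rule tuples below are a paraphrase for illustration, not Hamster's actual file syntax:

```python
def final_score(headers, rules):
    """Apply scoring rules in order: '=' values assign, ints increment."""
    score = 0
    for value, header, needle in rules:
        if needle not in headers.get(header, ""):
            continue
        if isinstance(value, str) and value.startswith("="):
            score = int(value[1:])   # direct assignment, e.g. "=-9999"
        else:
            score += value           # increment, e.g. -100 or +100
    return score

rules = [
    (-100, "Message-ID", "googlegroups.com"),  # penalize Google Groups
    (+100, "From", "user.name.here"),          # but whitelist this poster
]
msg = {"Message-ID": "<x@googlegroups.com>", "From": "user.name.here"}
# A message is loaded only if its final score is >= 0.
print(final_score(msg, rules) >= 0)  # True: the whitelisted poster survives
```

The difference from a plain drop rule is visible here: the increments cancel for the whitelisted poster, while a "=-9999" assignment would be unconditional.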

Jim Thompson

Mar 16, 2008, 11:19:27 AM
On Sun, 16 Mar 2008 13:35:18 -0000, "IanM" <nobody@no_where.co.uk>
wrote:

Aha! Even _I_ can manage that ;-)

I take it that "Regular Expressions" must be activated?

default

Mar 16, 2008, 2:58:08 PM

Neat. Thanks.

Tim Williams

Mar 16, 2008, 3:57:31 PM
For OE users: go to Tools / Message Rules / News..., start a new rule
where the "From:" line contains "@gmail.com" (or any other conditions
you'd like; sadly, OE doesn't expose most header fields), then mark or
delete the message.

I have it set to mark as Ignored, and I have a setting "Hide Ignored
Messages", which is set under the View / Current View menu. At the moment,
it appears I have 4436 messages total (through February), 3672 hiding
ignored messages, some of which are threads I've ignored (Message / Ignore
Conversation).

Tim

--
Deep Fryer: A very philosophical monk.
Website @ http://webpages.charter.net/dawill/tmoranwms

"default" <def...@defaulter.net> wrote in message
news:1205599143_135@isp.n...
>

> This is the draconian solution and blocks everything from Google - I
> plan to fine tune it a bit as time allows. This eliminates all google
> groupers and about 90% of the spam.
>
> Download and extract to a directory: newsproxy

...



MooseFET

Mar 18, 2008, 9:22:35 AM

To see if google does anything about it, I just reported a whole bunch
of them.

It will be interesting to see if the reported profiles go away.

Michael A. Terrell

Mar 18, 2008, 2:46:37 PM


I have this nice bridge for sale in Brooklyn. Buy it today, and I'll
throw in some nice oceanfront property in Phoenix.


--
aioe.org is home to cowards and terrorists

Add this line to your newsproxy nfilter.dat file to drop all aioe.org traffic:
* drop Path:*aioe.org!not-for-mail

http://improve-usenet.org/index.html

JeffM

Mar 18, 2008, 3:18:06 PM

Their anti-spam process appears to be FULLY automated.
It CAN work, but it requires MANY reports to reach critical mass.

It will take a complete re-think of Google's methodology
before we see a *big* difference on that front.

Now that their poorly-designed front gates have been breached,
in light of the non-linear increase, I doubt many will bother to
report.
I used to add a report to Google while I was reporting the jerks;
I've just given up on that.

In some cases, I *have* seen reports to ISPs have an effect.
Currently, *that* is where any effort should go IMO.


Rich Grise

Mar 19, 2008, 9:16:03 PM
On Tue, 18 Mar 2008 19:42:15 -0700, Do I really need to say? wrote:
> On Tue, 18 Mar 2008 14:46:37 -0400, "Michael A. Terrell"

>>MooseFET wrote:
>>>
>>> To see if google does anything about it, I just reported a whole bunch
>>> of them.
>>>
>>> It will be interesting to see if the reported profiles go away.
>>
>> I have this nice bridge for sale in Brooklyn. Buy it today, and I'll
>>throw in some nice oceanfront property in Phoenix.
>
> Jeez!
>
> Got any material that is older or even less humorous?

No, but I've got this:

A blonde's car gets a flat tire on I-80 westbound
during rush hour. She eases her car onto the shoulder
of the road. She carefully steps out of the car and
opens the trunk. She takes out two cardboard men,
unfolds them, and stands them at the rear of the
vehicle facing oncoming traffic. The lifelike
cardboard men are in trench coats, exposing their nude
bodies to approaching drivers. Not surprisingly,
traffic becomes snarled and backed up.

It's not long before a state trooper's car arrives.
The trooper, clearly agitated, approaches the blonde
yelling, "What is going on here?"

"My car broke down, Officer" says the woman, calmly.

"Well, what the hell are these obscene cardboard
pictures doing here by the road?!" asks the Officer.

"Helllllooooo ... those are my emergency flashers!"

Cheers!
Rich


JosephKK

Mar 22, 2008, 2:02:37 PM
On Sat, 15 Mar 2008 12:07:39 -0700 (PDT), JeffM <jef...@email.com>
wrote:

>Martin Griffith wrote:

Aside from the validation errors, it is very badly designed. Like
some of the semiconductor manufacturers sites.

JosephKK

Mar 22, 2008, 2:21:21 PM
On Tue, 18 Mar 2008 12:18:06 -0700 (PDT), JeffM <jef...@email.com>
wrote:

>MooseFET wrote:

That sounds like we need to ask someone (Jan?) to write a program to
automate reporting. We could then slam their mailboxes with
legitimate reports, they would almost have to act, though not
necessarily appropriately.

Guy Macon

Mar 22, 2008, 3:11:51 PM

JosephKK wrote:

The oreilly.com site really has one big error that manufactures
most of the others: a DOCTYPE of XHTML 1.0 Transitional. It only
has 145 errors when validated as HTML 4.01 Transitional.

--
Guy Macon
<http://www.guymacon.com/>


JeffM

Mar 22, 2008, 3:51:14 PM
>>>Martin Griffith wrote:
>>>>OT can you find a worse site than this?
>>>>http://havenworks.com/
>>>>
>>JeffM wrote:
>>>...but it's not even close to my record for non-compliance:
>>>http://validator.w3.org/check?uri=http://www.oreilly.com/catalog/winxpnut2/inx.html
>>>Failed validation, 7341 Errors
>>>
Martin Griffith wrote:
>and O'Reilly publishes books on......now let me see......
>
Yup. Exactly.

Guy Macon wrote:
>The oreilly.com site really has one big error that manufactures
>most of the others: a DOCTYPE of XHTML 1.0 Transitional. It only
>has 145 errors when validated against as HTML 4.01 Transitional.

WRT Martin's observation (which I hoped would be noted),
you'd think *this* company would have a clue
about standards compliance and validation.
It's obvious they made zero effort on that front. Just pitiful.


Michael A. Terrell

Mar 22, 2008, 5:51:14 PM

JosephKK wrote:
>
> That sounds like we need to ask someone (Jan?) to write a program to
> automate reporting. We could then slam their mailboxes with
> legitimate reports, they would almost have to act, though not
> necessarily appropriately.


They would just kill filter anyone who reports a lot of spam.

JosephKK

Mar 25, 2008, 1:54:51 AM
On Sat, 22 Mar 2008 13:48:44 -0700, SoothSayer
<SayS...@TheMonastery.org> wrote:

>On Sat, 22 Mar 2008 18:21:21 GMT, JosephKK <quiett...@yahoo.com>
>wrote:

>First, they will put you on the "complains too much" IGNORE list.
>
> Then, they will simply BLOCK YOU, completely ignoring your tripe.
>
> An "automated reporting" application is no different than the spam
>clients, and "nukers" out there. In other words, dipshit, you would be
>guilty of the very thing you attempt to complain about.
>
> The answer? Learn how to use your fucking filters, and stop pissing
>and moaning.
>
> Paid ISPs will ignore you as they would lose revenue otherwise. The
>free services ignore you because they are in place to provide anonymous
>posting, and have no concept of "abuse".
>
> As if I am telling you something which you do not already know.

As you guessed from the content of my post (as any longtime USENET
denizen should) I do expect the kinds of responses you predict. Just
the same it could be very useful evidence in a class action suit.
Think on, think hard. What we want is a sufficiently rabid lawyer to
smell money here.
