
Re: Meta: Re: How I deal with the enormous amount of spam


The Starmaker

Feb 6, 2024, 1:30:33 PM
The Starmaker wrote:
>
> Ross Finlayson wrote:
> >
> > On 02/04/2024 12:53 PM, The Starmaker wrote:
> > > The Starmaker wrote:
> > >>
> > >> Ross Finlayson wrote:
> > >>>
> > >>> On 02/04/2024 09:55 AM, Ross Finlayson wrote:
> > >>>> On 02/03/2024 02:46 PM, The Starmaker wrote:
> > >>>>> The Starmaker wrote:
> > >>>>>>
> > >>>>>> Ross Finlayson wrote:
> > >>>>>>>
> > >>>>>>> On 01/30/2024 12:54 PM, Ross Finlayson wrote:
> > >>>>>>>> On Monday, January 29, 2024 at 5:02:05 PM UTC-8, palsing wrote:
> > >>>>>>>>> Tom Roberts wrote:
> > >>>>>>>>>
> > >>>>>>>>>> I use Thunderbird to read Usenet. Recently sci.physics.relativity
> > >>>>>>>>>> has
> > >>>>>>>>>> been getting hundreds of spam posts each day, completely
> > >>>>>>>>>> overwhelming
> > >>>>>>>>>> legitimate content. These spam posts share the property that they
> > >>>>>>>>>> are
> > >>>>>>>>>> written in a non-latin script.
> > >>>>>>>>>
> > >>>>>>>>>> Thunderbird implements message filters that can mark a message
> > >>>>>>>>>> Read. So
> > >>>>>>>>>> I created a filter to run on sci.physics.relativity that marks
> > >>>>>>>>>> messages
> > >>>>>>>>>> Read. Then when reading the newsgroups, I simply display only unread
> > >>>>>>>>>> messages. The key to making this work is to craft the filter so
> > >>>>>>>>>> it marks
> > >>>>>>>>>> messages in which the Subject matches any of a dozen characters
> > >>>>>>>>>> picked
> > >>>>>>>>>> from some spam messages.
> > >>>>>>>>>
> > >>>>>>>>>> This doesn't completely eliminate the spam, but it is now only a few
> > >>>>>>>>>> messages per day.
> > >>>>>>>>>
> > >>>>>>>>>> Tom Roberts
> > >>>>>>>>> I would like to do the same thing, so I installed Thunderbird...
> > >>>>>>>>> but setting it up to read newsgroups is beyond my paltry computer
> > >>>>>>>>> skills and is not at all intuitive. If anyone can point to an
> > >>>>>>>>> idiot-proof tutorial for doing this It would be much appreciated.
> > >>>>>>>>>
> > >>>>>>>>> \Paul Alsing
> > >>>>>>>>
> > >>>>>>>> Yeah, it's pretty bad, or, worse anybody's ever seen it.
> > >>>>>>>>
> > >>>>>>>> I as well sort of mow the lawn a bit or mark the spam.
> > >>>>>>>>
> > >>>>>>>> It seems alright if it'll be a sort of clean break: on Feb 22
> > >>>>>>>> according to Google,
> > >>>>>>>> Google will break its compeerage to Usenet, and furthermore make
> > >>>>>>>> read-only
> > >>>>>>>> the archives, what it has, what until then, will be as it was.
> > >>>>>>>>
> > >>>>>>>> Over on sci.math I've had the idea for a while of making some brief
> > >>>>>>>> and
> > >>>>>>>> special purpose Usenet compeers, for only some few groups, or, you
> > >>>>>>>> know, the _belles lettres_ of the text hierarchy.
> > >>>>>>>>
> > >>>>>>>> "Meta: a usenet server just for sci.math"
> > >>>>>>>> -- https://groups.google.com/g/sci.math/c/zggff_pVEks
> > >>>>>>>>
> > >>>>>>>> So, there you can read the outlook of this kind of thing, then
> > >>>>>>>> while sort
> > >>>>>>>> of simple as the protocol is simple and its implementations
> > >>>>>>>> widespread,
> > >>>>>>>> how to deal with the "signal and noise" of "exposed messaging
> > >>>>>>>> destinations
> > >>>>>>>> on the Internet", well on that thread I'm theorizing a sort of,
> > >>>>>>>> "NOOBNB protocol",
> > >>>>>>>> figuring to make an otherwise just standard Usenet compeer, and
> > >>>>>>>> also for
> > >>>>>>>> email or messaging destinations, sort of designed with the
> > >>>>>>>> expectation that
> > >>>>>>>> there will be spam, and spam and ham are hand in hand, to exclude
> > >>>>>>>> it in simple terms.
> > >>>>>>>>
> > >>>>>>>> NOOBNB: New Old Off Bot Non Bad, Curated/Purgatory/Raw triple-feed
> > >>>>>>>>
> > >>>>>>>> (That and a firmer sort of "Load Shed" or "Load Hold" at the
> > >>>>>>>> transport layer.)
> > >>>>>>>>
> > >>>>>>>> Also it would be real great if at least there was surfaced to the
> > >>>>>>>> Internet a
> > >>>>>>>> read-only view of any message by its message ID, a "URL", or as for
> > >>>>>>>> a "URI",
> > >>>>>>>> a "URN", a reliable perma-link in the IETF "news" protocol, namespace.
> > >>>>>>>>
> > >>>>>>>> https://groups.google.com/g/sci.math/c/zggff_pVEks
> > >>>>>>>>
> > >>>>>>>> I wonder that there's a reliable sort of long-term project that
> > >>>>>>>> surfaces
> > >>>>>>>> "news" protocol message-IDs, .... It's a stable, standards-based
> > >>>>>>>> protocol.
> > >>>>>>>>
> > >>>>>>>>
> > >>>>>>>> Thunderbird, "SLRN", .... Thanks for caring. We care.
> > >>>>>>>>
> > >>>>>>>>
> > >>>>>>>> https://groups.google.com/g/sci.physics.relativity/c/ToBo6XOymUw
> > >>>>>>>>
> > >>>>>>>
> > >>>>>>> One fellow reached me via e-mail and he said, hey, the Googler spam is
> > >>>>>>> outrageous, can we do anything about it? Would you write a script to
> > >>>>>>> funnel all their message-ID's into the abuse reporting? And I was
> > >>>>>>> like,
> > >>>>>>> you know, about 2008 I did just that, there was a big spam flood,
> > >>>>>>> and I wrote a little script to find them and extract their
> > >>>>>>> posting-account,
> > >>>>>>> and the message-ID, and a little script to post to the posting-host,
> > >>>>>>> each one of the wicked spams.
> > >>>>>>>
> > >>>>>>> At the time that seemed to help, they sort of dried up, here there's
> > >>>>>>> that basically they're not following the charter, but, it's the
> > >>>>>>> posting-account
> > >>>>>>> in the message headers that indicate the origin of the post, not the
> > >>>>>>> email address. So, I wonder, given that I can extract the
> > >>>>>>> posting-accounts
> > >>>>>>> of all the spams, how to match the posting-account to then determine
> > >>>>>>> whether it's a sockpuppet-farm or what, and basically about sending
> > >>>>>>> them up.
> > >>>>>>
> > >>>>>> Let me see your little script. Post it here.
> > >>>>>
> > >>>>> Here is a list I currently have:
> > >>>>>
> > >>>>> salz.txt
> > >>>>> usenet.death.penalty.gz
> > >>>>> purify.txt
> > >>>>> NewsAgent110-MS.exe
> > >>>>> HipCrime's NewsAgent (v1_11).htm
> > >>>>> NewsAgent111-BE.zip
> > >>>>> SuperCede.exe
> > >>>>> NewsAgent023.exe
> > >>>>> NewsAgent025.exe
> > >>>>> ActiveAgent.java
> > >>>>> HipCrime's NewsAgent (v1_02)_files
> > >>>>> NewsCancel.java (source code)
> > >>>>>
> > >>>>> (plus updated python versions)
> > >>>>>
> > >>>>>
> > >>>>>
> > >>>>> (Maybe your script is inthere somewhere?)
> > >>>>>
> > >>>>>
> > >>>>>
> > >>>>> Show me what you got. walk the walk.
> > >>>>>
> > >>>>
> > >>>>
> > >>>> I try to avoid sketchy things like hiring a criminal botnet,
> > >>>> there's the impression that that's looking at 1000's of counts
> > >>>> of computer intrusion.
> > >>>>
> > >>>> With those being something about $50K and 10-25 apiece,
> > >>>> there's a pretty significant deterrence to such activities.
> > >>>>
> > >>>> I've never much cared for "OAuth", giving away the
> > >>>> keys-to-the-kingdom and all, here it looks like either
> > >>>> a) a bunch of duped browsers clicked away their identities,
> > >>>> or b) it's really that Google and Facebook are more than
> > >>>> half full of fake identities for the sole purpose of being fake.
> > >>>>
> > >>>> (How's your new deal going?
> > >>>> Great, we got a million users.
> > >>>> Why are my conversions around zero?
> > >>>> Your ad must not speak to them.
> > >>>> Would it help if I spiced it up?
> > >>>> Don't backtalk me, I'll put you on a list!)
> > >>>>
> > >>>> So, it seems mostly a sort of "spam-walling the Internet",
> > >>>> where it was like "we're going to reinvent the Internet",
> > >>>> "no, you aren't", "all right then we'll ruin this one".
> > >>>>
> > >>>> As far as search goes, there's something to be said
> > >>>> for a new sort of approach to search, given that
> > >>>> Google, Bing, Duck, ..., _all make the same results_. It's
> > >>>> just so highly unlikely that they'd _all make the same
> > >>>> results_, you figure they're just one.
> > >>>>
> > >>>> So, the idea, for somebody like me who's mostly interested
> > >>>> in writing on the Internet, is that lots of that is of the sort
> > >>>> of "works" vis-a-vis, the "feuilleton" or what you might
> > >>>> call it, ephemeral junk, that I just learned about in
> > >>>> Herman Hesse's "The Glass Bead Game".
> > >>>>
> > >>>> Then, there's an idea, that basically to surface high-quality
> > >>>> works to a search, is that there's what's called metadata,
> > >>>> for content like HTML, with regards to Dublin Core and
> > >>>> RDF and so on, about a sort of making for fungible collections
> > >>>> of works, what results searchable fragments of various
> > >>>> larger bodies of works, according to their robots.txt and
> > >>>> their summaries and with regards to crawling the content
> > >>>> and so on, then to make federated common search corpi,
> > >>>> these kinds of things.
> > >>>>
> > >>>>
> > >>>>
> > >>>
> > >>> It's like "why are they building that new data center",
> > >>> and it's like "well it's like Artificial Intelligence, inside
> > >>> that data center is a million virts and each one has a
> > >>> browser emulator and a phone app sandbox and a
> > >>> little notecard that prompts its name, basically it's
> > >>> a million-headed hydra called a sims-bot-farm,
> > >>> that for pennies on the dollar is an instant audience."
> > >>>
> > >>> "Wow, great, do they get a cut?" "Don't be talking about my cut."
> > >>>
> > >>> Usenet traffic had been up recently, ....
> > >>>
> > >>> I think they used to call it "astro-turfing".
> > >>> "Artificial Intelligence?" "No, 'Fake eyeballs'."
> > >>
> > >> I have NewsAgent111-MS.exe
> > >>
> > >> I seem to be missing version 2.0
> > >>
> > >> Do you have the 2.0 version?
> > >>
> > >> I'll trade you.
> > >>
> >> I'll give you my python version with (GUI)!!!! (Tkinter)
> > >>
> > >> let's trade!
> > >>
> > >> don't bogart
> > >
> > > I seem to be missing this version:
> > >
> > > https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
> > >
> > > Do you have it? you must have!
> > >
> > >
> > >
> > >
> > >
> >
> > Nope, I just wrote a little script to connect to NNTP
> > with a Yes/No button on the subject, tapped through
> > those, and a little script to send an HTTP request to
> > the publicly-facing return-to-sender in-box, for each.
> >
> > Here's all the sources you need: IETF RFC editor.
> > Look for "NNTP". How to advise Google of this is
> > that each domain on the Internet is supposed to
> > have an "abuse@domain" email inbox, though there's
> > probably also a web request interface, as with regards
> > to publicly facing services, and expected to be
> > good actors on the network.
> >
> > Anyways if you read through "Meta: a usenet server
> > just for sci.math", what I have in mind is a sort
> > of author's and writer's oriented installation,
> > basically making for vanity printouts and generating
> > hypertext collections of contents and authors and
> > subjects and these kinds of things, basically for
> > on the order of "find all the postings of Archimedes
> > Plutonium, and, the threads they are in, and,
> > make a hypertext page of all that, a linear timeline,
> > and also thread it out as a linear sequence".
> >
> > I.e. people who actually post to Usenet are sometimes
> > having written interesting things, and, thus having
> > it so that it would be simplified to generate message-ID
> > listings and their corresponding standard URL's in the
> > standard IETF "news" URL protocol, and to point that
> > at a given news server or like XLink, is for treating
> > Usenet its archives like a living museum of all these
> > different authors posts and their interactions together.
> >
> > I.e., here it's "belles lettres" and "fair use",
> > not just "belles" and "use".
> >
> > It seemed nice of Google Groups to front this for a long time,
> > now they're quitting.
> >
> > I imagine Internet Relay Chat's still insane, though.
> >
> > Anyways I stay away from any warez and am proud that
> > since about Y2K at least I've never bootlegged anything,
> > and never uploaded a bootleg. Don't want to give old Shylock
> > excuses, and besides, I wrote software for a living.
>
> Anyways, I don't know who was talking about "any warez" or "bootlegs",
> since I was referring to programs and scripts that read:
>
> "FREE, which means you can copy it and redistribute"
> "Similarly, the source is provided as reference and can be redistributed
> freely as well. "
>
> HipCrime's NewsAgent (v2.0) is FREE, which means you can copy it and
> redistribute it at will, as long as you give credit to the original
> author. Similarly, the source is provided as reference and can be
> redistributed freely as well.
>
> https://web.archive.org/web/20051023050609/http://newsagent.p5.org.uk/
>
> You seem to be too much 'in your head', on a high horse...
>
> "FREE, which means you can copy it and redistribute"
> "Similarly, the source is provided as reference and can be redistributed
> freely as well. "
>
> So, show me that wicked script you wrote: "funnel all their
> message-ID's"
> by people you call spammers who 'funnel' their products and services
> through Usenet newsgroups.
>
> You are sooooo wicked.
>
> and a nanofossils

Anyways, there is only one person that knows what 'nanofossils' means,
and that is Ross Finlayson.


I just realized that Ross Finlayson doesn't know of NEWSAGENT.


Anyways, ...

"Anyways"???? Who talks like that?


Anyways..


the problem of the 'flooding' is not the spammers, it's the 'scientific
community'. They caused the problem.
They removed the feature that NewsAgent used to get rid of ALL flooding
and spammers. But, but, the
members of the scientific community could not trust their own members to
use it against them.

If one member of the 'scientific community' disagreed with another
member of the 'scientific community'...they were removed!


Too much power.


I called it...God Mode.











--
The Starmaker -- To question the unquestionable, ask the unaskable,
to think the unthinkable, mention the unmentionable, say the unsayable,
and challenge the unchallengeable.
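The little script described in the quoted exchange — pulling each spam's message-ID and posting-account out of the headers for abuse reporting — might be sketched as below. This is a hypothetical reconstruction, not the original script; the header sample is invented, though `Injection-Info` and its `posting-account` parameter are the standard Netnews place (RFC 5536) where Google Groups posts carried that value.

```python
import re

def extract_spam_info(raw_headers: str) -> dict:
    """Pull the Message-ID and (if present) the posting-account
    out of a raw article header block."""
    info = {"message_id": None, "posting_account": None}
    # Message-ID is wrapped in angle brackets per the article format.
    mid = re.search(r"^Message-ID:\s*<([^>]+)>", raw_headers,
                    re.IGNORECASE | re.MULTILINE)
    if mid:
        info["message_id"] = mid.group(1)
    # posting-account is a parameter of the Injection-Info header.
    acct = re.search(r"posting-account=([\w=-]+)", raw_headers)
    if acct:
        info["posting_account"] = acct.group(1)
    return info

# Invented sample header block for illustration:
headers = """\
Message-ID: <example-123@googlegroups.com>
Injection-Info: google-groups.googlegroups.com; posting-host=203.0.113.7; posting-account=AbCdEf9hIjK
"""
print(extract_spam_info(headers))
```

Funneling each extracted pair to a provider's abuse@ address or web form is then a separate step.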

The Starmaker

Feb 6, 2024, 2:02:04 PM
Using NewsAgent in God Mode was great! Except...if you didn't know how to use it
properly you could make a mistake and remove *EVERYONE'S* posts by accident.

Everyone just completely disappeared!


Oops. I made a booboo.


Like that Twilight Zone episode where everyone disappears by a click of a watch.



Where is everybody? MAJOR KILLFILE!

So, which is worse?

The Starmaker

Feb 6, 2024, 2:23:05 PM
You know what GOD MODE Killfile is? That means you not only killfiled everyone, but you also sort of
turned everyone else's killfile on. Nobody sees nobody.


It's like the Atomic Bomb of Usenet!


(only yous guys make bombs like that)
(then yous get angry when everyone has the atomic bomb)

typical.

The Starmaker

Feb 11, 2024, 3:37:19 PM
As I mentioned above,
the problem of the 'flooding' is not the spammers,
it's the 'scientific community'. They caused the problem.
They removed the feature that was used to get rid of ALL flooding...

Why? Because of the War of the Gods.

If one member of the 'scientific community' disagreed with another
member, or especially if he had a higher IQ...they were removed!

War of the Gods.

I will have no other gods before me...they think of themselves.


Albert Einstein was a Jewish Supremacists.

If you disagreed with him, you were called
"not man of science" and "not Jewish".

Of course his people would blacklist yous.


So, put back the feature that gets rid of the spamming flooders and
watch what will happen to the rests of yous.


It will be again...The War of the Gods.


And I will not have any Gods before me either.


off wit your heads!

Only those below average intelligence should dominate the Usenet!

anything above dat...banished!

Physfitfreak

Feb 12, 2024, 5:28:02 PM
On 2/11/2024 2:37 PM, The Starmaker wrote:
> As I mentioned above,
> the problem of the 'flooding' is not the spammers,
> it's the 'scientific community'. They caused the problem.
> They removed the feature that was used to get rid of ALL flooding...
>
> Why? Because of the War of the Gods.


So a comic book is responsible for that?


The Starmaker

Feb 19, 2024, 2:05:58 PM
So, where do you people get the idea AFTER Feb 22, the flooding will
stop???


Are yous saying it's all coming from Google Groups source website???

Hooker Tzaran Balanowsky

Feb 19, 2024, 6:17:33 PM
Dlzc wrote:
> No. They are saying that when Google stops crawling USENET for content,
> any spam posted to USENET will be invisible to search engines (or at
> least the Google search engine). The advertisers don't get paid, so
> wasting time posting to USENET becomes pointless.

not sure you understand the procedure. The flooding comes most likely from
google itself, wanting to destroy the usenet. Making money out of it.

otherwise they could stop it long ago. Obvious a wanker 𝗸𝗵𝗮𝘇𝗮𝗿 𝗴𝗼𝘆 from
amrica, their own 𝘁𝗲𝗮𝗺 configuration. They shit together. It's a satanic
club.

Volney

Feb 19, 2024, 8:44:36 PM
The Thai casino spam is from Google and will (should) stop.

Any spammers not spamming through Google can continue to spam, but if
Google is no longer indexing Usenet, it will no longer be profitable for
them.

Ross Finlayson

Feb 19, 2024, 11:42:43 PM
I think it's a sham for "spam-walling the Internet".

I.e. "Aw you got spammed, good-bye. Good luck standing
up any publicly-facing service." It seems a lot like
any other sort of protection racket bit.

I mean it's "thousands of fake Google accounts", ...,
or "gee thanks OAuth", maybe it's just the terrible
decision to give away the keys-to-the-kingdom by thousands of stupid
gamblers who click through a creds-farm to see a breast,
like "Are you sure?", "Are you sure you can't read English?".
Either way they're dupes.

If you look at something like narkiv or novabbs, they
just put some filters on their INND or what and not much spam there.
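The sort of filter such servers run — like the Thunderbird filter upthread that marks messages whose Subject is written in a non-Latin script — can be approximated in a few lines. A minimal sketch, assuming a majority-of-letters rule (the 0.5 cutoff is an arbitrary choice, not anything from the thread):

```python
import unicodedata

def looks_like_script_spam(subject: str, threshold: float = 0.5) -> bool:
    """Flag a Subject line whose letters are mostly non-Latin script,
    mirroring the Thunderbird character-match filter described upthread."""
    letters = [ch for ch in subject if ch.isalpha()]
    if not letters:
        return False
    # A character's Unicode name encodes its script, e.g. "LATIN SMALL
    # LETTER A" vs "THAI CHARACTER KO KAI".
    non_latin = sum(1 for ch in letters
                    if "LATIN" not in unicodedata.name(ch, ""))
    return non_latin / len(letters) >= threshold

print(looks_like_script_spam("Relativity of simultaneity"))  # → False
print(looks_like_script_spam("คาสิโนออนไลน์ ฟรีเครดิต"))  # → True (Thai casino spam)
```

A real deployment would hook a check like this into the server's article filter rather than a reader-side mark-as-read rule.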

On the usual hired Usenet providers they are statedly anti-spam,
and there's some cancel-bot activity going on them, though for
example it gets out variously, each Usenet site can have its own
policy. Seems the idea is "don't post links".

So, it's sort of varying as a patch-work. There's lots to it,
though.


It was nice of Google to front to Usenet these years,
what'll be missing will be "easy search", so the idea
is to make "a Library/Museum Browse, Exhibit, Tour, Carrel"
type experience, "fuller search".
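The "reliable perma-link" idea raised earlier in the thread has a standard form: RFC 5538 defines the "news" URI scheme, where a message-ID (minus its angle brackets) names one article and a server/group pair names a group. A minimal sketch; the server name and message-ID are placeholders:

```python
def news_url(message_id: str) -> str:
    """Standard IETF 'news' URL (RFC 5538) for a single article,
    named by its message-ID without the angle brackets."""
    return "news:" + message_id.strip("<>")

def group_news_url(server: str, group: str) -> str:
    """'news' URL naming a whole group on a specific server."""
    return f"news://{server}/{group}"

print(news_url("<example-post@news.example.org>"))
# → news:example-post@news.example.org
print(group_news_url("news.example.org", "sci.physics.relativity"))
# → news://news.example.org/sci.physics.relativity
```

Because the message-ID form names no server, any compeer carrying the archive can resolve it, which is what makes it work as a perma-link.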


Hopefully there'll be a revival. Have you ever seen
a tent revival? Sometimes those can get pretty good.


I'm trying to figure out standard files for standard protocols
for reference implementations and reference editions,
of bring-your-own or brew-your-own, Usenet and this kind of thing.
Over on sci.math it's called "Meta: a usenet server just for sci.math".

Currently its name idea is like "usenet.science", but it's variable.


