
Go-http-client


Martin

Jan 31, 2024, 5:39:14 AM
In the last couple of days my website has had an increase in traffic,
from about 30 different IP addresses, all with a User-Agent of
"Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various User-Agents,
including Windows, Linux & macOS. If that works (as it will) it then
issues GETs for about 30 varied files, then stops.

It seems that Go-http-client is a package which "provides HTTP client
and server implementations" but it is suddenly being used by lots of
IPs in a suspicious way.

Anyone else seen this?

They obviously do not abide by robots.txt (or even read it), so the
only way I know to block them is to add them to .htaccess as deny
from entries - some share the same first two octets.

Are there any better ways?
One way is just to ignore them, I know, but I would not want a trickle
to turn into a flood.
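
A minimal sketch of that deny-from approach, using the older Apache
2.2-style mod_access directives in .htaccess (the prefixes below are
documentation placeholders, not the actual offending ranges):

```apache
# Block whole address ranges by prefix (placeholder prefixes)
Order allow,deny
Allow from all
Deny from 203.0.113
Deny from 198.51.100
```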

Martin

--
Martin Avison
Note that unfortunately this email address will become invalid
without notice if (when) any spam is received.

Chris Hughes

Jan 31, 2024, 7:00:12 AM
In message <5b2b4df6...@avisoft.f9.co.uk>
Martin <New...@avisoft.f9.co.uk> wrote:

> In the last couple of days my website has had an increase in traffic,
> from about 30 different IP addresses, all with a User-Agent of
> "Go-http-client/1.1".

> Each starts with a "GET / HTTP/1.1" request, with various User-Agents,
> including Windows, Linux & macOS. If that works (as it will) it then
> issues GETs for about 30 varied files, then stops.

> It seems that Go-http-client is a package which "provides HTTP client
> and server implementations" but it is suddenly being used by lots of
> IPs in a suspicious way.

> Anyone else seen this?

> They obviously do not abide by robots.txt (or even read it), so the
> only way I know to block them is to add them to .htaccess as deny
> froms - some have the same top two numbers.

> Are there any better ways?
> One way is just to ignore them, I know, but I would not want a trickle
> to turn into a flood.

Is your web space provided via PlusNet?

If so you could report possible suspicious activity.


--
Chris Hughes

Martin

Jan 31, 2024, 7:40:46 AM
In article <a56a552b5b.chris@mytardis>,
Yes ... but I doubt they would be interested at the current level.

Chris Hughes

Jan 31, 2024, 8:00:11 AM
In message <5b2b591f...@avisoft.f9.co.uk>
I meant to say via PlusNet's Community Forum, which often gets a faster
response than ringing the normal customer support - they frequently
don't seem to know some users have web space, as you use a legacy
system (i.e. Force9).



--
Chris Hughes

Martin

Jan 31, 2024, 8:25:46 AM
In article <e9095c2b5b.chris@mytardis>,
Aaah yes - that is a good idea. Thanks.

Theo

Jan 31, 2024, 10:18:31 AM
Martin <New...@avisoft.f9.co.uk> wrote:
> In the last couple of days my website has had an increase in traffic,
> from about 30 different IP addresses, all with a User-Agent of
> "Go-http-client/1.1".
>
> Each starts with a "GET / HTTP/1.1" request, with various User-Agents,
> including Windows, Linux & macOS. If that works (as it will) it then
> issues GETs for about 30 varied files, then stops.
>
> It seems that Go-http-client is a package which "provides HTTP client
> and server implementations" but it is suddenly being used by lots of
> IPs in a suspicious way.
>
> Anyone else seen this?

Looking at the riscos.info logs, there's a variety of entries matching that.
Since the start of December there have been 1632 requests.
Some examples (I have redacted part of the IPs, but they're all with
completely different prefixes):

Testing if the site will proxy for another:

106.2.x.x - - [19/Jan/2024:11:14:23 +0000] "CONNECT www.whitehouse.gov:443 HTTP/1.1" 302 292 "-" "Go-http-client/1.1"
80.91.x.x - - [20/Jan/2024:11:30:17 +0000] "CONNECT google.com:443 HTTP/1.1" 302 284 "-" "Go-http-client/1.1"

Testing for vulnerable pages:

91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //alfa.php HTTP/1.1" 404 287 "-" "Go-http-client/1.1"
91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //doc.php HTTP/1.1" 404 286 "-" "Go-http-client/1.1"
91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //marijuana.php HTTP/1.1" 404 292 "-" "Go-http-client/1.1"
91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //mini.php HTTP/1.1" 404 287 "-" "Go-http-client/1.1"
91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //shell.php HTTP/1.1" 404 288 "-" "Go-http-client/1.1"
91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //small.php HTTP/1.1" 404 288 "-" "Go-http-client/1.1"
91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //wso.php HTTP/1.1" 404 286 "-" "Go-http-client/1.1"
91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //wp-info.php HTTP/1.1" 404 290 "-" "Go-http-client/1.1"

A legit access followed by some probing:

195.20.x.x - - [06/Dec/2023:05:14:05 +0000] "GET / HTTP/1.1" 302 287 "-" "Go-http-client/1.1"
195.20.x.x - - [06/Dec/2023:05:14:16 +0000] "GET / HTTP/1.1" 301 26 "-" "Go-http-client/1.1"
195.20.x.x - - [06/Dec/2023:05:14:17 +0000] "GET /index.php/RISC_OS HTTP/1.1" 200 7210 "http://www.riscos.info/" "Go-http-client/1.1"
195.20.x.x - - [06/Dec/2023:05:14:19 +0000] "GET /+CSCOE+/logon.html HTTP/1.1" 302 305 "-" "Go-http-client/1.1"
195.20.x.x - - [06/Dec/2023:05:14:50 +0000] "GET /global-protect/login.esp HTTP/1.1" 302 311 "-" "Go-http-client/1.1"
195.20.x.x - - [06/Dec/2023:05:14:50 +0000] "GET /global-protect/login.esp HTTP/1.1" 404 303 "-" "Go-http-client/1.1"

The ownership of some of those prefixes is:

netname: Netease-Network
descr: Guangzhou NetEase Computer System Co.,Ltd
country: CN

organisation: ORG-FZTA3-RIPE
org-name: Ferdinand Zink trading as Tube-Hosting
country: DE

organisation: ORG-LA1853-RIPE
org-name: Limenet
org-type: OTHER
address: 84 W Broadway, Ste 200
address: 03038 Derry
address: United States of America

organisation: ORG-GL496-RIPE
org-name: Shelter LLC
country: RU

so not a geographic pattern.

> They obviously do not abide by robots.txt (or even read it), so the
> only way I know to block them is to add them to .htaccess as deny
> froms - some have the same top two numbers.
>
> Are there any better ways?
> One way is just to ignore them, I know, but I would not want a trickle
> to turn into a flood.

They appear to just be probing for vulnerable sites. I don't think anything
you do will affect the rate, they are just picking targets at random. I'd
guess it's just coming from a malware toolkit of some kind that happens to
be programmed in Go, possibly running through a botnet.

I doubt any kind of IP filtering is going to work. So it boils down to
how they're bothering you - filling up the log (something that's been
happening to riscos.info a few times of late), eating your bandwidth or CPU.

There are too many IPs to block in firewall rules. You could block accesses
from Go-http-client, but I think it would still log as blocked. Mostly,
from the above, they aren't actually interacting with real content on the
site, so the CPU is not doing much serving real pages, and the 302/404
traffic is minimal (~300 bytes per request). Maybe some kind of adaptive
firewalling/rate limiting would help, but that would probably block
genuine traffic.
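
The adaptive-firewalling idea could be sketched with fail2ban - a
hypothetical filter file, assuming Apache's combined log format (a jail
definition pointing at the access log would still be needed):

```ini
# /etc/fail2ban/filter.d/go-http-client.conf  (hypothetical filter name)
[Definition]
# Ban hosts whose request lines end with the Go-http-client User-Agent
failregex = ^<HOST> .*"Go-http-client/1\.1"$
```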

Unless you have scripts on your site that are actually vulnerable (in which
case you should fix them) I'm not sure there's much to be done. If you
provide a site on the internet, people (or bots) on the internet connect to
it. That's the deal.

Theo

Martin

Jan 31, 2024, 12:04:11 PM
In article <kGr*WK...@news.chiark.greenend.org.uk>,
Theo <theom...@chiark.greenend.org.uk> wrote:
> Martin <New...@avisoft.f9.co.uk> wrote:
> > In the last couple of days my website has had an increase in
> > traffic, from about 30 different IP addresses, all with a
> > User-Agent of "Go-http-client/1.1".
> >
> > Each starts with a "GET / HTTP/1.1" request, with various
> > User-Agents, including Windows, Linux & macOS. If that works (as
> > it will) it then issues GETs for about 30 varied files, then
> > stops.
> >
> > It seems that Go-http-client is a package which "provides HTTP
> > client and server implementations" but it is suddenly being used
> > by lots of IPs in a suspicious way.
> >
> > Anyone else seen this?

> Looking at the riscos.info logs, there's a variety of entries
> matching that. Since the start of December there have been 1632
> requests.

I have had over 800 in the previous 2 days.

> Some examples (I have redacted part of the IPs, but
> they're all with completely different prefixes):

> Testing if the site will proxy for another:

Not seen any like that.

> Testing for vulnerable pages:

Or that!

> A legit access followed by some probing:

All mine have been to existing pages or files - all returned with
status 200.

> The ownership of some of those prefixes is:

Mine seemed to be allocated to Asia Pacific (APNIC).
Difficult these days to get more precise information.

> They appear to just be probing for vulnerable sites. I don't think
> anything you do will affect the rate, they are just picking targets
> at random. I'd guess it's just coming from a malware toolkit of
> some kind that happens to be programmed in Go, possibly running
> through a botnet.

Probably - Googling 'botnet using go-http-client' gives lots of hits!

> I doubt any kind of IP filtering is going to work. So it boils
> down to hot they're bothering you - filling up the log (something
> that's been happening to riscos.info a few times of late), eating
> your bandwidth or CPU.

They are certainly vastly increasing my bandwidth usage, though I have
not quantified it.

> There are too many IPs to block in firewall rules. You could block
> accesses from Go-http-client, but I think it would still log as
> blocked. Mostly from the above they aren't actually interacting
> with real content on the site so the CPU is not doing much serving
> real pages, and the 302/404 traffic is minimal (~300 bytes per
> request). Maybe some kind of adaptive firewalling/rate limiting,
> but that would probably block genuine traffic.

Mine are downloading real files (including zips) with status 200.

> Unless you have scripts on your site that are actually vulnerable
> (in which case you should fix them) I'm not sure there's much to be
> done.

No scripts here. Just plain HTML.

> If you provide a site on the internet, people (or bots) on
> the internet connect to it. That's the deal.

Yes, indeed. I will just keep an eye open for the moment.

Thanks

Theo

Jan 31, 2024, 1:32:22 PM
Martin <New...@avisoft.f9.co.uk> wrote:
> In article <kGr*WK...@news.chiark.greenend.org.uk>,
> Theo <theom...@chiark.greenend.org.uk> wrote:
>
> I have had over 800 in the previous 2 days.
>
> All mine have been to existing pages or files - all returned with
> status 200.
>
> Mine seemed to be allocated to Asia Pacific (APNIC).
> Difficult these days to get more precise information.

Try a 'whois' on the IP, it should tell you the Autonomous System (AS) which
owns the IP range. That is usually an ISP but can sometimes be a company.
Of course you'd need to talk to them to go any further.

> Mine are downloading real files (including zips) with status 200.
>
> No scripts here. Just plain HTML.

I would guess somebody's using a tool to crawl your site, for what purpose
we don't know. It happens to be written using a popular Go HTTP library and
they didn't change the User-Agent. It doesn't sound like the same kind of
probing I'm seeing.

I've been seeing a lot of crawls from AI companies (Bytedance, Facebook) who
are sucking data for training AI models. Perhaps they are doing something
similar.

Theo

Martin

Feb 22, 2024, 4:54:09 AM
In article <5b2b4df6...@avisoft.f9.co.uk>,
Martin <New...@avisoft.f9.co.uk> wrote:
> In the last couple of days my website has had an increase in
> traffic, from about 30 different IP addresses, all with a
> User-Agent of "Go-http-client/1.1".

> Each starts with a "GET / HTTP/1.1" request, with various
> User-Agents, including Windows, Linux & macOS. If that works (as it
> will) it then issues GETs for about 30 varied files, then stops.

> It seems that Go-http-client is a package which "provides HTTP
> client and server implementations" but it is suddenly being used by
> lots of IPs in a suspicious way.

> Anyone else seen this?

> They obviously do not abide by robots.txt (or even read it), so the
> only way I know to block them is to add them to .htaccess as deny
> froms - some have the same top two numbers.

> Are there any better ways? One way is just to ignore them, I know,
> but I would not want a trickle to turn into a flood.

The trickle continued, some days far outnumbering other requests.

But I have found a way to stop them! I added to my .htaccess file...

RewriteCond %{HTTP_USER_AGENT} "=Go-http-client/1.1"
RewriteRule .* - [F,L]

... which now returns 403 Forbidden to them. That stopped 260 requests
in 12 hours yesterday.

This certainly works on PlusNet - may or may not on other ISPs.
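
For what it's worth, where mod_rewrite is unavailable a similar block
can be sketched with mod_setenvif (Apache 2.4 syntax; whether a given
host permits these directives in .htaccess is an assumption):

```apache
# Tag requests from the Go default client, then refuse them
SetEnvIfNoCase User-Agent "^Go-http-client" bad_bot
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```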