
wget fails in Debian Jessie


Leslie Rhorer

Jun 7, 2015, 12:00:04 AM

I had a script running just fine under Debian Squeeze, but that server is now offline for repair and upgrade to Jessie, so I am now running the script under Jessie, and the script is failing when attempting to scrape data off a website using wget. Under Jessie, wget produces an SSL error when it tries to log in to the website. There is an open bug report #785016, to which I added a comment, but so far no response has come from the developers. Needing to get this script fully working, I am attempting to use curl for the purpose, but I'm not having any luck there, either. I don't get the error, but I don't get the correct web page, either. These are the two lines using wget:

wget --save-cookies cookies.txt --keep-session-cookies --no-check-certificate --post-data 'timeOffset=300&UserName=xxxxx%40mygrande.net&Password=yyyyy&RememberMe=false' https://mytotalconnectcomfort.com/portal

wget --load-cookies cookies.txt --no-check-certificate https://mytotalconnectcomfort.com/portal/188049/Zones/page2

I tried the following using curl:

curl -c cookies.txt -d 'timeOffset=300&UserName=xxxxxxx%40mygrande.net&Password=yyyyyyy&RememberMe=false' https://mytotalconnectcomfort.com/portal

Instead of returning the main page, it just returns:

<html><head><title>Object moved</title></head><body>
<h2>Object moved to <a href="/portal/">here</a>.</h2>
</body></html>

What it should return is this:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">



<!--[if IE 8 ]> <html class="ie8" xmlns="http://www.w3.org/1999/xhtml"> <![endif]-->
<!--[if IE 9 ]> <html class="ie9" xmlns="http://www.w3.org/1999/xhtml"> <![endif]-->
<!--[if (gt IE 9)|!(IE)]><!-->
<html xmlns="http://www.w3.org/1999/xhtml">
<!--<![endif]-->
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />

<title>Thermostat(s) in Home</title>

...

data-url="/portal/Device/Control/43935?page=1" data-clickenabled="True">
<td class="leftendcap" />
<td class="location-zone-title">
<div class="location-name">Diningroom</div>
</td>
<td class="zone-temperature">

<span class="tempValue" style="">
66&deg;
</span>
</td>

...

data-url="/portal/Device/Control/220861?page=1" data-clickenabled="True">
<td class="leftendcap" />
<td class="location-zone-title">
<div class="location-name">Guest</div>
</td>
<td class="zone-temperature">

<span class="tempValue" style="">
78&deg;
</span>
</td>

...

data-url="/portal/Device/Control/219808?page=1" data-clickenabled="True">
<td class="leftendcap" />
<td class="location-zone-title">
<div class="location-name">Leslie</div>
</td>
<td class="zone-temperature">

<span class="tempValue" style="">
73&deg;
</span>
</td>

... etc.

from which I would be able to scrape the temperatures. Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?


--
To UNSUBSCRIBE, email to debian-us...@lists.debian.org
with a subject of "unsubscribe". Trouble? Contact listm...@lists.debian.org
Archive: https://lists.debian.org/edcaf0db-12d4-498b...@googlegroups.com

Reco

Jun 7, 2015, 5:00:04 AM

Hi.

On Sat, 6 Jun 2015 20:31:44 -0700 (PDT)
Leslie Rhorer <lrh...@mygrande.net> wrote:

> Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?

Don't depend on curl. Use good old socat combined with wget:

socat TCP4-LISTEN:8080,reuseaddr,fork \
OPENSSL:mytotalconnectcomfort.com:443,verify=0

wget --save-cookies cookies.txt --keep-session-cookies \
--post-data '…' http://localhost:8080/portal

wget --load-cookies cookies.txt \
http://localhost:8080/portal/188049/Zones/page2

pkill socat

Reco



Eduardo M KALINOWSKI

Jun 7, 2015, 7:10:03 AM

On 06/07/2015 12:31 AM, Leslie Rhorer wrote:
> I had a script running just fine under Debian Squeeze, but that server is now offline for repair and upgrade to Jessie, so I am now running the script under Jessie, and the script is failing when attempting to scrape data off a website using wget. Under Jessie, wget produces an SSL error when it tries to log in to the website. There is an open bug report #785016, to which I added a comment, but so far no response has come from the developers. Needing to get this script fully working, I am attempting to use curl for the purpose, but I'm not having any luck there, either. I don't get the error, but I don't get the correct web page, either. These are the two lines using wget:
>
> wget --save-cookies cookies.txt --keep-session-cookies --no-check-certificate --post-data 'timeOffset=300&UserName=xxxxx%40mygrande.net&Password=yyyyy&RememberMe=false' https://mytotalconnectcomfort.com/portal
>
> wget --load-cookies cookies.txt --no-check-certificate https://mytotalconnectcomfort.com/portal/188049/Zones/page2
>
> I tried the following using curl:
>
> curl -c cookies.txt -d 'timeOffset=300&UserName=xxxxxxx%40mygrande.net&Password=yyyyyyy&RememberMe=false' https://mytotalconnectcomfort.com/portal
>
> Instead of returning the main page, it just returns:
>
> <html><head><title>Object moved</title></head><body>
> <h2>Object moved to <a href="/portal/">here</a>.</h2>
> </body></html>
>
> [snip]
>
> Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?

Well, for starters, try using the URL
https://mytotalconnectcomfort.com/portal/ (note trailing slash) as
suggested by the error message. Alternatively, there might also be a
curl option to automatically follow redirects.


--
We're overpaying him, but he's worth it. -Samuel Goldwyn

Eduardo M KALINOWSKI
edu...@kalinowski.com.br



to...@tuxteam.de

Jun 7, 2015, 7:20:04 AM

On Sun, Jun 07, 2015 at 08:04:39AM -0300, Eduardo M KALINOWSKI wrote:
> On 06/07/2015 12:31 AM, Leslie Rhorer wrote:
> > I had a script running just fine under Debian Squeeze, but that server is now offline for repair and upgrade to Jessie, so I am now running the script under Jessie, and the script is failing when attempting to scrape data off a website using wget. Under Jessie, wget produces an SSL error when it tries to log in to the website. There is an open bug report #785016, to which I added a comment, but so far no response has come from the developers. Needing to get this script fully working, I am attempting to use curl for the purpose, but I'm not having any luck there, either. I don't get the error, but I don't get the correct web page, either. These are the two lines using wget:
> >
> > wget --save-cookies cookies.txt --keep-session-cookies --no-check-certificate --post-data 'timeOffset=300&UserName=xxxxx%40mygrande.net&Password=yyyyy&RememberMe=false' https://mytotalconnectcomfort.com/portal
> >
> > wget --load-cookies cookies.txt --no-check-certificate https://mytotalconnectcomfort.com/portal/188049/Zones/page2
> >
> > I tried the following using curl:
> >
> > curl -c cookies.txt -d 'timeOffset=300&UserName=xxxxxxx%40mygrande.net&Password=yyyyyyy&RememberMe=false' https://mytotalconnectcomfort.com/portal
> >
> > Instead of returning the main page, it just returns:
> >
> > <html><head><title>Object moved</title></head><body>
> > <h2>Object moved to <a href="/portal/">here</a>.</h2>
> > </body></html>

This is most probably the sign of a redirect (see below)

[...]

> Well, for starters, try using the URL
> https://mytotalconnectcomfort.com/portal/ (note trailing slash) as
> suggested by the error message.

This might be a good check.

> Alternatively, there might also be a
> curl option to automatically follow redirects.

There is: it's option -L (or --location). Additionally, you can set the
maximum number of redirects to follow with --max-redirs
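Applied to the login command from the original post, that would look something like this (placeholder credentials, and the cap of five redirects is an arbitrary choice):

```shell
# Same POST as before, but follow redirects (-L), five hops at most
curl -L --max-redirs 5 -c cookies.txt \
    -d 'timeOffset=300&UserName=xxxxx%40mygrande.net&Password=yyyyy&RememberMe=false' \
    https://mytotalconnectcomfort.com/portal
```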

regards
-- t



Leslie Rhorer

Jun 7, 2015, 5:20:04 PM

On Sunday, June 7, 2015 at 4:00:04 AM UTC-5, Reco wrote:

> > Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?
>
> Don't depend on curl. Use good old socat combined with wget:

Why? The -L option in curl did the trick. Is there some overriding reason why I should use wget instead of curl? Curl winds up being simpler and faster in this case.
>
> socat TCP4-LISTEN:8080,reuseaddr,fork \
> OPENSSL:mytotalconnectcomfort.com:443,verify=0
>
> wget --save-cookies cookies.txt --keep-session-cookies \
> --post-data '...' http://localhost:8080/portal
>
> wget --load-cookies cookies.txt \
> http://localhost:8080/portal/188049/Zones/page2
>
> pkill socat
>
> Reco



Leslie Rhorer

Jun 7, 2015, 5:20:04 PM

> > > Instead of returning the main page, it just returns:
> > >
> > > <html><head><title>Object moved</title></head><body>
> > > <h2>Object moved to <a href="/portal/">here</a>.</h2>
> > > </body></html>
>
> This is most probably the sign of a redirect (see below)
>
> [...]
>
> > Well, for starters, try using the URL
> > https://mytotalconnectcomfort.com/portal/ (note trailing slash) as
> > suggested by the error message.
>
> This might be a good check.

No, I had already tried that.

> > Alternatively, there might also be a
> > curl option to automatically follow redirects.
>
> There is: it's option -L (or --location). Additionally, you can set the
> maximum number of redirects to follow with --max-redirs

That did it! Thanks.



David Wright

Jun 7, 2015, 8:10:03 PM

Quoting Leslie Rhorer (lrh...@mygrande.net):
> On Sunday, June 7, 2015 at 4:00:04 AM UTC-5, Reco wrote:
>
> > > Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?
> >
> > Don't depend on curl. Use good old socat combined with wget:
>
> Why? The -L option in curl did the trick. Is there some overriding reason why I should use wget instead of curl? Curl winds up being simpler and faster in this case.

I'm not commenting on this particular case, but the default options in
curl are a pain in the proverbial.

As I mentioned 8 April, curl outputs to stdout so you've got to set
-O to get the "correct" filename.

Then you need -R to get the correct timestamp applied.

You also need to check for the existence of a file of the same name
else curl will silently overwrite it. I haven't figured out an alias
to prevent this.

wget handles these cases correctly. curl might be fine for scripting
but I find wget far friendlier for interactive use.

Cheers,
David.



to...@tuxteam.de

Jun 8, 2015, 4:40:03 AM

On Sun, Jun 07, 2015 at 07:03:29PM -0500, David Wright wrote:
> Quoting Leslie Rhorer (lrh...@mygrande.net):
> > On Sunday, June 7, 2015 at 4:00:04 AM UTC-5, Reco wrote:
> >
> > > > Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?
> > >
> > > Don't depend on curl. Use good old socat combined with wget:
> >
> > Why? The -L option in curl did the trick [...]
>
> I'm not commenting on this particular case, but the default options in
> curl are a pain in the proverbial.

FWIW, i happily use both (when I want to download recursively a bunch of
pages, following links, wget is the one; when I'm debugging some http
strangeness, it's curl). Interactively and in scripts. And I *never*
would give such an advice as "Don't depend on curl".

And as to "curl's default options ..." -- they may be a pain in *your*
proverbial; that doesn't make them a pain in everyone's proverbial.
I'm perfectly fine with them.

All generalizations suck ;-)

-- t



Leslie Rhorer

Jun 8, 2015, 4:50:03 PM

On Sunday, June 7, 2015 at 7:10:03 PM UTC-5, David Wright wrote:
> Quoting Leslie Rhorer:
> > On Sunday, June 7, 2015 at 4:00:04 AM UTC-5, Reco wrote:
> >
> > > > Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?
> > >
> > > Don't depend on curl. Use good old socat combined with wget:
> >
> > Why? The -L option in curl did the trick. Is there some overriding reason why I should use wget instead of curl? Curl winds up being simpler and faster in this case.
>
> I'm not commenting on this particular case, but the default options in
> curl are a pain in the proverbial.

I was asking about this case.

> As I mentioned 8 April, curl outputs to stdout so you've got to set
> -O to get the "correct" filename.

Since it's easier in this case *NOT* to have the output sent to a file (or worse, a directory structure), curl is easier.

> Then you need -R to get the correct timestamp applied.

I don't care about the timestamp. All I need to do is scrape the room temperatures (a total of 20 bytes) from the entire output of several KB.

> You also need to check for the existence of a file of the same name
> else curl will silently overwrite it. I haven't figured out an alias
> to prevent this.

Since I *want* the file overwritten every time the script runs, this isn't a problem. Indeed, it is preferred. That, plus I don't have curl write to the file directly. I pipe the result to grep and then sed, and then redirect that output to the file. One line with curl. It was over a dozen with wget.
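Schematically, the one-liner is along these lines (the URL is the one from my earlier post; the grep/sed patterns here are simplified stand-ins for the real ones):

```shell
# Fetch the zones page (following redirects, reusing the login cookies)
# and keep only the numbers from the tempValue spans; the patterns are
# illustrative, not my exact ones
curl -s -L -b cookies.txt \
    https://mytotalconnectcomfort.com/portal/188049/Zones/page2 \
    | grep -o '[0-9]\{1,\}&deg;' \
    | sed 's/&deg;//' > temperatures.txt
```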

> wget handles these cases correctly. curl might be fine for scripting
> but I find wget far friendlier for interactive use.

The first four words in my original post were "I had a script...". Not that I mind the extraneous information concerning the broader merits of wget vs. curl, but I am trying to solve a specific problem here, after all.



Bob Proulx

Jun 8, 2015, 6:30:06 PM

Leslie Rhorer wrote:
> Reco wrote:
> > Don't depend on curl. Use good old socat combined with wget:

That use of socat was clever. I didn't like the pkill socat though.
Wouldn't be good if there were another one running at the same time.
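Tracking the PID of the background job would avoid that; an untested sketch:

```shell
# Untested sketch: remember this particular socat's PID so we kill only
# it, not every socat on the system
socat TCP4-LISTEN:8080,reuseaddr,fork \
    OPENSSL:mytotalconnectcomfort.com:443,verify=0 &
SOCAT_PID=$!

# ... run the wget commands against http://localhost:8080 here ...

kill "$SOCAT_PID" 2>/dev/null || true   # || true in case it already exited
```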

> Why? The -L option in curl did the trick. Is there some
> overriding reason why I should use wget instead of curl? Curl
> winds up being simpler and faster in this case.

Some time ago Reco and I were discussing this and Reco noted that curl
uses openssl while wget uses gnutls. That was Reco's reason for
preferring curl over wget at that time.

https://lists.debian.org/20150409082351.GA24040@x101h

That is a difference that causes problems and is probably related to
the problem you just experienced. This doesn't play into plain http
connections. But for https gnutls and openssl have different
behavior. They shouldn't. But they do. Therefore the suggestion
away from wget is really a suggestion away from gnutls and over to the
more time proven openssl used by curl.

However recently in Unstable and therefore Stretch Testing curl
changed from openssl over to gnutls too. Jessie is already released
and this doesn't change Jessie or earlier releases.

http://bugs.debian.org/342719

So in the future if that is maintained both curl and wget will have
the same gnutls behavior. Which might be different behavior from web
browsers as most web browsers use openssl. I am not sure which side
to cheer for as I would like both to work. Security benefits from
having independent implementations.

Bob

David Wright

Jun 8, 2015, 9:30:04 PM

Quoting Leslie Rhorer (lrh...@mygrande.net):
> On Sunday, June 7, 2015 at 7:10:03 PM UTC-5, David Wright wrote:
> > Quoting Leslie Rhorer:
> > > On Sunday, June 7, 2015 at 4:00:04 AM UTC-5, Reco wrote:
> > >
> > > > > Does anyone have any ideas how I could get curl to handle the task, since wget is failing? Some other utility?
> > > >
> > > > Don't depend on curl. Use good old socat combined with wget:
> > >
> > > Why? The -L option in curl did the trick. Is there some overriding reason why I should use wget instead of curl? Curl winds up being simpler and faster in this case.
> >
> > I'm not commenting on this particular case, but the default options in
> > curl are a pain in the proverbial.
>
> I was asking about this case.

Of course you were. That's why you posted. And that's why I preceded
my post with "I'm not commenting on this particular case". Perhaps I
should have started with

\begin{Oblique opinions on the default options in curl compared with
the behaviour of wget}

> > As I mentioned 8 April, curl outputs to stdout so you've got to set
> > -O to get the "correct" filename.
>
> Since it's easier in this case *NOT* to have the output sent to a file (or worse, a directory structure), curl is easier.

wget -O - if so required; therefore irrelevant in a script.

> > Then you need -R to get the correct timestamp applied.
>
> I don't care about the timestamp.

Irrelevant for standard output.

> > You also need to check for the existence of a file of the same name
> > else curl will silently overwrite it. I haven't figured out an alias
> > to prevent this.
>
> Since I *want* the file overwritten every time the script runs, this isn't a problem.

Irrelevant for standard output.

> Indeed, it is preferred.

Why, if it's irrelevant to you?

The default action of cp, for example, is overwriting. That's why I
and many others spell cp as cp -i. But curl appears to have no
equivalent of the -i switch, which I think is a great failing, not
something to be preferred! I say this hoping someone will contradict me.
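The closest I've managed is a wrapper function rather than a true alias (the function name and its behaviour here are my own invention, nothing built into curl):

```shell
# Sketch of a no-clobber wrapper for curl: refuse to download when the
# output file already exists (my own invention, not a curl feature)
curl_noclobber() {
    outfile=$1
    url=$2
    if [ -e "$outfile" ]; then
        echo "curl_noclobber: $outfile exists, not overwriting" >&2
        return 1
    fi
    curl -R -o "$outfile" "$url"
}
```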

> That, plus I don't have curl write to the file directly. I pipe the result to grep and then sed, and then redirect that output to the file.

...which has more to do with bash's clobber than the behaviour of
either wget or curl, which is at the other end of the pipe and
therefore has no knowledge of any output file.

> One line with curl. It was over a dozen with wget.

So?

> > wget handles these cases correctly. curl might be fine for scripting
> > but I find wget far friendlier for interactive use.
>
> The first four words in my original post were "I had a script...".

...and that line was ~680 characters long. Please wrap.

> 'Not that I mind the extraneous information concerning the broader merits of wget vs curl,

Phew.

> but I am trying to solve a specific problem here, after all.

...and I hope I haven't stood in the way of others helping you.

Cheers,
David.



Reco

Jun 9, 2015, 6:50:04 AM

On Mon, Jun 08, 2015 at 04:20:21PM -0600, Bob Proulx wrote:
> Leslie Rhorer wrote:
> > Reco wrote:
> > > Don't depend on curl. Use good old socat combined with wget:
>
> That use of socat was clever. I didn't like the pkill socat though.
> Wouldn't be good if there were another one running at the same time.

Yes, there's room for improvement. Presumably socat can write its own
pid to a user-specified pidfile, but I was too lazy to check the manpage.


> > Why? The -L option in curl did the trick. Is there some
> > overriding reason why I should use wget instead of curl? Curl
> > winds up being simpler and faster in this case.
>
> Some time ago Reco and I were discussing this and Reco noted that curl
> uses openssl while wget uses gnutls. That was Reco's reason for
> preferring curl over wget at that time.
>
> https://lists.debian.org/20150409082351.GA24040@x101h

And as the current discussion shows - those reasons are still valid.


> That is a difference that causes problems and is probably related to
> the problem you just experienced. This doesn't play into plain http
> connections. But for https gnutls and openssl have different
> behavior. They shouldn't. But they do. Therefore the suggestion
> away from wget is really a suggestion away from gnutls and over to the
> more time proven openssl used by curl.
>
> However recently in Unstable and therefore Stretch Testing curl
> changed from openssl over to gnutls too. Jessie is already released
> and this doesn't change Jessie or earlier releases.
>
> http://bugs.debian.org/342719
>
> So in the future if that is maintained both curl and wget will have
> the same gnutls behavior.
>
> Which might be different behavior from web
> browsers as most web browsers use openssl.

A minor nitpick here.

Iceweasel/Firefox use libnss, not openssl.
Chrome/Chromium use libnss.
Anything based on webkit-gtk actually uses gnutls.
I'm unsure about webkit-qt, though.

About the only browser that actually uses openssl I can remember is w3m.

Reco



Bob Proulx

Jun 12, 2015, 5:00:04 PM

Reco wrote:
> Bob Proulx wrote:
> > That use of socat was clever. I didn't like the pkill socat though.
> > Wouldn't be good if there were another one running at the same time.
>
> Yes, there's room for improvement. Presumably socat can write its own
> pid to a user-specified pidfile, but I was too lazy to check the manpage.

I don't think socat does. But one can use start-stop-daemon to manage
things for you. The /etc/init.d/rsync file contains an example of
doing such using --make-pidfile and so forth.

> > Some time ago Reco and I were discussing this and Reco noted that curl
> > uses openssl while wget uses gnutls. That was Reco's reason for
> > preferring curl over wget at that time.
> >
> > https://lists.debian.org/20150409082351.GA24040@x101h
>
> And as the current discussion shows - those reasons are still valid.

Yes. I was just keeping neutral in the debate. I note the problem,
and agree it is a problem, and hope that gnutls improves.

My own problem with gnutls is that it seems it requires *all* of the
certificate chains to verify valid instead of *any* of them. Meaning
that some sites that only include a valid certificate chain for one
path but have at least one path not fully valid will fail the wget
gnutls test but will work with a web browser and (apparently) libnss.
That isn't nice either.

> > Which might be different behavior from web
> > browsers as most web browsers use openssl.
>
> A minor nitpick here.
>
> Iceweasel/Firefox use libnss, not openssl.
> Chrome/Chromium use libnss.
> Anything based on webkit-gtk actually uses gnutls.
> I'm unsure about webkit-qt, though.
>
> About the only browser that actually uses openssl I can remember is w3m.

Good update. I hadn't internalized that the web browsers used libnss
instead of openssl. Thanks!

Bob