
ReadToEnd hangs on stream from HttpWebResponse


Robert Wilkinson

Feb 21, 2003, 2:46:26 PM
Apologies for raising this subject again, but I don't believe there has
been a resolution.

I've contacted people by email who previously posted about the problem,
and they haven't been able to solve it.

The code is like this - and the problem is that with some responses
the code will hang at the line html = sr.ReadToEnd();

----------- Start of code extract -----------------------

HttpWebRequest rq = (HttpWebRequest) HttpWebRequest.Create(url);

// Set up headers and data etc.

HttpWebResponse rp;
string html;
try
{
    Encoding encode = System.Text.Encoding.GetEncoding("iso-8859-1");
    rp = (HttpWebResponse) rq.GetResponse();
    StreamReader sr = new StreamReader(rp.GetResponseStream(), encode);
    html = sr.ReadToEnd();
    sr.Close();
    rp.Close();
}
catch (WebException wex)
{
    // Find out what's happened.
}

------------- End of code extract ---------------------------------

The problem is almost certainly not with the request itself - I can
insert code before html = sr.ReadToEnd(); to read a few characters
from the stream, and they are all correct.
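
For example, a quick probe along these lines (just a debug sketch; note
that it consumes the characters it reads, so ReadToEnd then returns the
remainder):

char[] probe = new char[16];
int got = sr.Read(probe, 0, probe.Length);
Console.WriteLine(new string(probe, 0, got));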

Any suggestions?
Robert Wilkinson
Toll Free 866 787 1223
http://abacus24-7.com
Best value in Ink Jet Cartridges

Joerg Jooss

Feb 23, 2003, 5:55:56 AM
Robert Wilkinson spoke:

> Apologies for raising this subject again, but I don't believe there
> has been a resolution.
>
> I've contacted people by email who previously posted about the
> problem, and they haven't been able to solve it.
>
> The code is like this - and the problem is that with some responses
> the code will hang at the line html = sr.ReadToEnd();

[...]

> The problem is almost certainly not with the request itself - I can
> insert code before html = sr.ReadToEnd(); to read a few characters
> from the stream, and they are all correct.
>
> Any suggestions?

Robert,

what about trying to receive the entire response as bytes first and then
decode it?

--
Joerg Jooss
joerg...@gmx.net

Robert Wilkinson

Feb 24, 2003, 11:23:06 AM

Thanks for the response. I had the same thought, so I tried this code
(not efficient, just for debugging):

string result = "";
while (sr.Peek() >= 0)   // -1 means no more characters
{
    int thisInt = sr.Read();
    result += System.Convert.ToChar(thisInt);
}

That brings up another problem: sr.Peek returns -1 before I have got
all of the message (after about 1420 bytes). I can see this when I
access a resource where ReadToEnd doesn't hang, because I can follow
this debug code with a ReadToEnd and it will get the rest of the
message.
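
One way around Peek (which I believe is documented to return -1 not
only at the end of the stream but also when the underlying stream does
not support seeking, as a network stream doesn't) would be to rely on
the return value of Read instead - a rough sketch using the same sr:

System.Text.StringBuilder sb = new System.Text.StringBuilder();
char[] chunk = new char[1024];
int read;
while ((read = sr.Read(chunk, 0, chunk.Length)) > 0)
{
    sb.Append(chunk, 0, read);
}
string result = sb.ToString();

That only sidesteps the Peek issue, though - if the server never
finishes sending, the Read calls will still block just like ReadToEnd
does.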

Did you have another suggestion about how to read all the bytes in the
message before decoding? (That is in fact what I would like to do for
other reasons.)

Thanks again,

Robert

Robert Wilkinson

Feb 25, 2003, 11:24:37 AM
On Mon, 24 Feb 2003 16:23:06 GMT, rob...@newsguy.com (Robert
Wilkinson) wrote:

I activated a Network Monitor, and I can see that the problem is that
my machine is not receiving all of the page it is requesting. To get a
definitive resolution I'd need to dig further into the protocol.

However, I am sitting behind a NetScreen box, and I have had other
suspicions that it causes problems on big transfers.

Also, to really resolve the issue I think I will need to put a network
monitor on both sides of the NetScreen box - and I can't do that at
the moment.

I have solved the problem - at least temporarily - by reducing the
size of the page that is returned from the HTTP request (good job that
I can do that in my application!)

Any further input or suggestions welcomed.

However, it clearly is a problem that ReadToEnd can hang with no
timeout. (I wonder if I could wrap it in its own thread and then kill
the thread on a timeout... or something?)
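
Along the lines of that thread idea, a rough sketch (using newer C#
syntax than the framework had at the time, and ReadToEndWithTimeout is
a name I'm making up for illustration): run ReadToEnd on a background
thread and give up if it doesn't finish in time.

static string ReadToEndWithTimeout(StreamReader reader, TimeSpan timeout)
{
    string result = null;
    Exception error = null;
    Thread worker = new Thread(() =>
    {
        try { result = reader.ReadToEnd(); }
        catch (Exception ex) { error = ex; }
    });
    worker.IsBackground = true;   // don't keep the process alive if we give up
    worker.Start();
    if (!worker.Join(timeout))
    {
        throw new TimeoutException("ReadToEnd did not finish within the timeout.");
    }
    if (error != null) throw error;
    return result;
}

This needs using System, System.IO and System.Threading. On a timeout
the worker thread is still blocked inside ReadToEnd, so the request and
response should be aborted or closed as well.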

Joerg Jooss

Mar 1, 2003, 6:10:24 AM
Robert Wilkinson spoke:

Get rid of the StreamReader -- it decodes the response no matter how
many bytes you read at a time. Use raw Streams instead:

// Create your request...

// Receive response
HttpWebResponse response = (HttpWebResponse) request.GetResponse();

// Use 64 kB response buffer
MemoryStream ostream = new MemoryStream(0x10000);

// This will track the actual content length
long contentLength = 0;

using (Stream istream = response.GetResponseStream())
{
    // Use 4 kB read buffer
    byte[] buffer = new byte[0x1000];

    int bytes;
    while ((bytes = istream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ostream.Write(buffer, 0, bytes);
        contentLength += bytes;
    }
}

byte[] responseBytes = ostream.ToArray();

// Decode responseBytes...
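
For the decoding step, reusing the iso-8859-1 encoding from your
original code, something like this should do (just a sketch):

System.Text.Encoding encode = System.Text.Encoding.GetEncoding("iso-8859-1");
string html = encode.GetString(responseBytes);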

Compare the computed (actual) contentLength to the value provided in
the Content-Length header. A mismatch there should be the root of all
evil...
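
For instance (response.ContentLength is -1 when the server sent no
Content-Length header):

if (response.ContentLength >= 0 && contentLength != response.ContentLength)
{
    Console.WriteLine("Content-Length header says {0} bytes, but {1} arrived.",
        response.ContentLength, contentLength);
}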

Note that if the response's Content-Length header is correct, you can
read the response like this, avoiding the loop:

byte[] buffer = new byte[response.ContentLength];
istream.Read(buffer, 0, buffer.Length);
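
One caveat with that shortcut: Read isn't guaranteed to fill the buffer
in a single call, particularly on a network stream, so a more defensive
version would loop until the buffer is full - roughly:

byte[] buffer = new byte[response.ContentLength];
int offset = 0;
while (offset < buffer.Length)
{
    int read = istream.Read(buffer, offset, buffer.Length - offset);
    if (read == 0)
        break;   // server closed the connection before sending everything
    offset += read;
}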

Cheers,
--
Joerg Jooss
joerg...@gmx.net

dazh...@gmail.com

Jan 22, 2018, 3:02:17 AM
On Tuesday, February 25, 2003 at 12:23:06 AM UTC+8, Robert Wilkinson wrote:
Great, thanks a lot. Much appreciated from across the GFW.