I'm using the WinInet API to make an HTTP request of a remote servlet
from a Win32 application. To avoid blocking the main GUI thread, I've
created a worker thread to make the WinInet calls (the worker thread
calls WinInet synchronously). The worker thread calls back to the main
thread when all of its work has completed successfully.
Qualitatively, all is working fine (i.e., the main thread is not blocked
while the worker thread is executing, the callback is called
successfully, and the Win32 application gets the data it needs).
However, the main thread takes a severe performance hit during the time
that the worker thread makes calls to HttpSendRequest (to send the
request to the servlet) and InternetReadFile (to receive the response
data from the servlet). These two functions appear to be using the CPU
at the expense of all other executing threads (including the main GUI
thread of my application; are they doing some sort of repetitive polling
across the wire?). I've tried setting the thread priority of the worker
thread to THREAD_PRIORITY_LOWEST before launching the thread, but with
no apparent effect.
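For reference, here is a minimal sketch of how the priority could be applied before the thread runs any WinInet code (WorkerThreadProc, StartLowPriorityWorker, and pParams are placeholder names, not from my actual code); creating the thread suspended guarantees the lowered priority is in effect before the first WinInet call:

```cpp
// Sketch: start a worker thread at THREAD_PRIORITY_LOWEST.
// WorkerThreadProc and pParams are placeholders for the real code.
#include <windows.h>
#include <process.h>

unsigned __stdcall WorkerThreadProc(void* pParams)
{
    // ... synchronous WinInet calls go here ...
    return 0;
}

HANDLE StartLowPriorityWorker(void* pParams)
{
    // Create suspended so the priority takes effect before any work runs.
    unsigned threadId = 0;
    HANDLE hThread = reinterpret_cast<HANDLE>(
        ::_beginthreadex(NULL, 0, WorkerThreadProc, pParams,
                         CREATE_SUSPENDED, &threadId));
    if (hThread != NULL)
    {
        ::SetThreadPriority(hThread, THREAD_PRIORITY_LOWEST);
        ::ResumeThread(hThread);
    }
    return hThread;
}
```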
Any ideas as to how to throttle the CPU usage of these WinInet calls?
One idea that I'm currently working on is to make the WinInet calls
asynchronously (perhaps the async WinInet functions are not so
greedy?). However, I'm running into a basic problem in my calls to
HttpSendRequestEx. The relevant (C++) source code is as follows:
hRequest = ::HttpOpenRequest(s_hConnect,
                             _T("GET"),
                             requestURI.c_str(),
                             _T("HTTP/1.0"),
                             NULL,
                             NULL,
                             INTERNET_FLAG_RELOAD |
                                 INTERNET_FLAG_NO_AUTO_REDIRECT,
                             0);
if ( hRequest == static_cast<HINTERNET>(NULL) )
throw MyException(_T("Failed call to HttpOpenRequest"));
INTERNET_BUFFERS inetBufferIn;
::ZeroMemory(&inetBufferIn, sizeof(inetBufferIn));
inetBufferIn.dwStructSize = sizeof(inetBufferIn);
INTERNET_BUFFERS inetBufferOut;
::ZeroMemory(&inetBufferOut, sizeof(inetBufferOut));
inetBufferOut.dwStructSize = sizeof(inetBufferOut);
if ( !::HttpSendRequestEx(hRequest, &inetBufferIn, &inetBufferOut,
                          HSR_ASYNC, s_wininetHSRContext) )
{
    if ( ::GetLastError() != ERROR_IO_PENDING )
    {
        throw MyException(_T("Failed call to HttpSendRequest"));
    }
    else
    {
        DWORD waitResult = ::WaitForSingleObject(s_hWaitEvent, 30000L);
        if ( waitResult != WAIT_OBJECT_0 )
        {
            throw MyException(_T("Failed to send HTTP request to Server ")
                              _T("for Cookie & Location URL"));
        }
    }
}
The call to HttpSendRequestEx is returning FALSE with an error code of
87 ("The parameter is incorrect"). I'm assuming that my use of the
INTERNET_BUFFERS structs is incorrect, but the MSDN documentation for
the use of these structs with HttpSendRequestEx is rather scant.
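One note on the error 87, as an assumption on my part: the MSDN pages for later WinInet versions describe the lpBuffersOut parameter of HttpSendRequestEx as reserved (NULL), so passing &inetBufferOut may itself be the invalid parameter. Also, async mode only works if the session was opened with INTERNET_FLAG_ASYNC and a status callback was registered first. A sketch of that setup (the callback body is abbreviated, and the agent string is invented):

```cpp
// Sketch of async WinInet setup: the session must be opened with
// INTERNET_FLAG_ASYNC and a status callback registered before any
// request is issued. The callback body is deliberately minimal.
#include <windows.h>
#include <wininet.h>
#include <tchar.h>

static HANDLE s_hWaitEvent = NULL;

static void CALLBACK StatusCallback(HINTERNET hInternet,
                                    DWORD_PTR dwContext,
                                    DWORD dwInternetStatus,
                                    LPVOID lpvStatusInformation,
                                    DWORD dwStatusInformationLength)
{
    if (dwInternetStatus == INTERNET_STATUS_REQUEST_COMPLETE)
    {
        // Just signal the thread blocked in WaitForSingleObject.
        ::SetEvent(s_hWaitEvent);
    }
}

HINTERNET OpenAsyncSession()
{
    s_hWaitEvent = ::CreateEvent(NULL, FALSE, FALSE, NULL);
    HINTERNET hSession = ::InternetOpen(_T("MyAgent/1.0"),
                                        INTERNET_OPEN_TYPE_PRECONFIG,
                                        NULL, NULL,
                                        INTERNET_FLAG_ASYNC);
    if (hSession != NULL)
        ::InternetSetStatusCallback(hSession, StatusCallback);
    return hSession;
}
```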
Any enlightenment provided would be most appreciated.
Thank you.
- Andy Scheiner
I don't have a solution for your 'HttpSendRequestEx' problem, but I would
suggest not using 'HttpSendRequestEx' to solve the performance problem
you've noted.
I've written a similar program with multiple threads:
1 thread for the GUI
1 thread for a "thread scheduler" for the "worker threads"
1-16 threads for the "worker threads" which are downloading the URLs
All threads are using Wininet in *synchronous* mode only!
All threads are running with default (standard) priority!
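Each worker thread's loop is essentially nothing more than the following sketch (error handling trimmed; hConnect and OnDataReceived are invented names for this example, not my actual code):

```cpp
// Sketch of one synchronous worker thread's download loop.
// hConnect is an open InternetConnect handle; OnDataReceived is a
// placeholder for whatever the program does with the bytes.
#include <windows.h>
#include <wininet.h>
#include <tchar.h>

bool DownloadUrl(HINTERNET hConnect, LPCTSTR objectName)
{
    HINTERNET hRequest = ::HttpOpenRequest(hConnect, _T("GET"), objectName,
                                           NULL, NULL, NULL,
                                           INTERNET_FLAG_RELOAD, 0);
    if (hRequest == NULL)
        return false;

    bool ok = false;
    if (::HttpSendRequest(hRequest, NULL, 0, NULL, 0))
    {
        char buffer[4096];
        DWORD bytesRead = 0;
        // InternetReadFile blocks in synchronous mode; it returns TRUE
        // with bytesRead == 0 at the end of the response.
        while (::InternetReadFile(hRequest, buffer, sizeof(buffer),
                                  &bytesRead) && bytesRead > 0)
        {
            // OnDataReceived(buffer, bytesRead);  // placeholder
        }
        ok = true;
    }
    ::InternetCloseHandle(hRequest);
    return ok;
}
```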
I have been developing and using that program for 3(!) years. I started
developing it with Wininet v1.0 (MSIE 3.0), and now I am using it with
Wininet v1.2 (MSIE 5.01), and I've *never* seen any performance issues with
'HttpSendRequest' or 'InternetReadFile'.
Furthermore, I've tested that program under Win95 (16MB, 486), Win98 (64MB,
PII), and WinNT (64MB, PII), and experimented with different thread
priorities; again, I've never seen any (unexpected) performance problems.
I wrote 'unexpected' because there is, of course, a performance problem
if the worker threads are set to a higher priority than the GUI thread and
the thread-scheduler thread and have a lot of work to do. That noticeable
performance problem was observed only under WinNT. Anyway, there's no need
to change the priority of the threads -- I just wanted to test it.
Even my "stress" tests didn't turn up a performance issue. A "stress" test
for my program is to concurrently download 16 large URLs from an HTTP/FTP
server running on the *same* machine! If the server runs on the same
machine as the client, there should be enough traffic on the TCP/IP stack
to show any bottlenecks.
Hmmm, on the other hand -- I've never done this "stress" test over the
wire... In practice I request 4-6 URLs concurrently over a connection of
roughly 1.4 MByte/min.
I just wanted to tell you that maybe you have some other type of problem:
- a multithreading problem?
- polling?
- a lot of work in your status callback?
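On the last point: the status callback runs on a WinInet thread, so any heavy work there (parsing, logging, touching the GUI) can starve the rest of the application. A sketch of keeping it cheap -- just hand the status off to the main thread (WM_APP_INET_STATUS and g_hMainWnd are invented names for this example):

```cpp
// Sketch: a status callback that only posts a notification to the
// GUI thread instead of doing work itself. WM_APP_INET_STATUS and
// g_hMainWnd are invented names for this example.
#include <windows.h>
#include <wininet.h>

#define WM_APP_INET_STATUS (WM_APP + 1)
extern HWND g_hMainWnd;

static void CALLBACK CheapStatusCallback(HINTERNET hInternet,
                                         DWORD_PTR dwContext,
                                         DWORD dwInternetStatus,
                                         LPVOID lpvStatusInformation,
                                         DWORD dwStatusInformationLength)
{
    // Do not parse, log, or touch the GUI here -- just hand the
    // status code to the main thread and return immediately.
    ::PostMessage(g_hMainWnd, WM_APP_INET_STATUS,
                  static_cast<WPARAM>(dwInternetStatus),
                  static_cast<LPARAM>(dwContext));
}
```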
Regards,
Markus
"Andrew Scheiner" <asch...@ea.com> schrieb im Newsbeitrag
news:39BE7DB1...@ea.com...