I am wondering if somebody can give me a hint about how to make NTFS
read ahead when using memory-mapped files. My application is generating
a lot of page faults, and the file I am accessing is opened with the
SEQUENTIAL_SCAN hint. Also, opening the file buffered or unbuffered
gives me similar results in the number of page faults.
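Roughly, the access pattern looks like this (a stripped-down sketch, not my
actual code; names are illustrative and error handling is omitted):

#include <windows.h>

void scan_mapped(const char *path)
{
    HANDLE hFile = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                               OPEN_EXISTING, FILE_FLAG_SEQUENTIAL_SCAN, NULL);
    DWORD size = GetFileSize(hFile, NULL);
    HANDLE hMap = CreateFileMappingA(hFile, NULL, PAGE_READONLY, 0, 0, NULL);
    const BYTE *base = (const BYTE *)MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, 0);
    volatile BYTE sink = 0;
    DWORD i;

    for (i = 0; i < size; i += 4096)    /* touch each page in order      */
        sink ^= base[i];                /* each touch can page-fault in  */

    UnmapViewOfFile((LPCVOID)base);
    CloseHandle(hMap);
    CloseHandle(hFile);
}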
Any help would be appreciated !!!
Thanks in advance,
Luis Rivera
--
Norman Black
Stony Brook Software
To reply via email remove the ".dud" from my address.
Luis F. Rivera wrote in message <35D22D26...@cs.uiuc.edu>...
On Thu, 13 Aug 1998 12:29:14 -0700, "Norman Black"
<nrbla...@ix.netcom.com> wrote:
>If you are going to do sequential reads don't use memory mapped files.
>Memory mapped files are good for random access. The SEQUENTIAL_SCAN
>CreateFile flag will cause the system to do larger read aheads than when the
>system auto detects sequential access. It also causes the file to NOT be
>cached by the cache manager,
If you're talking about NT, this last statement isn't true. It's true
that FILE_FLAG_SEQUENTIAL_SCAN triples the size of the read-ahead (from
the default 64 kB to 192 kB). Now if the app asks for, let's say, 1 kB
and the cache reads 192 kB instead, where do you think the extra 191 kB
is read into, if not into the cache???
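To make that concrete, here is the kind of cached read loop I mean (just a
sketch; error handling omitted). Each small request is satisfied by copying
from the cache, and the extra data the read-ahead brought in stays there for
the requests that follow:

#include <windows.h>

void read_one_k_at_a_time(const char *path)
{
    HANDLE h = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                           OPEN_EXISTING, FILE_FLAG_SEQUENTIAL_SCAN, NULL);
    char buf[1024];
    DWORD got;

    /* The app asks for 1 kB per call; the cache manager's read-ahead pulls
       in much larger runs, and that data sits in the system cache until the
       following calls consume it. */
    while (ReadFile(h, buf, sizeof buf, &got, NULL) && got != 0)
        ;  /* process buf[0..got) */

    CloseHandle(h);
}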
--- Jamie Hanrahan, Kernel Mode Systems, San Diego CA (j...@cmkrnl.com)
Drivers, internals, networks, applications, and training for VMS and Windows NT
NT kernel driver FAQ, links, and other information: http://www.cmkrnl.com/
Please reply in news, not via e-mail.
Yes, yes, BUT the second time you read that file it will NOT be in the
cache. I have tested this. I run a program that searches a bunch of source
files for a string, and when sequential scan is used the hard disk still
grinds if I do a search again on the same files. If I remove the sequential
scan flag, the second search has no hard drive access. I take this to mean
that all of the files were cached.
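The test was essentially this shape (an illustrative sketch, not the actual
search program; timing and the file list are stand-ins, error handling
omitted):

#include <windows.h>
#include <stdio.h>

static void read_whole_file(const char *path, DWORD flags)
{
    HANDLE h = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                           OPEN_EXISTING, flags, NULL);
    char buf[64 * 1024];
    DWORD got;
    while (ReadFile(h, buf, sizeof buf, &got, NULL) && got != 0)
        ;  /* a real search would scan buf[0..got) here */
    CloseHandle(h);
}

void time_two_passes(const char **files, int n, DWORD flags)
{
    int pass, i;
    /* With FILE_FLAG_SEQUENTIAL_SCAN the second pass still hits the disk;
       without it the second pass is served from the cache. */
    for (pass = 0; pass < 2; pass++) {
        DWORD t0 = GetTickCount();
        for (i = 0; i < n; i++)
            read_whole_file(files[i], flags);
        printf("pass %d: %lu ms\n", pass + 1,
               (unsigned long)(GetTickCount() - t0));
    }
}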
My conclusion is that when using sequential scan, once you read past a
point in the file, that cached portion of the file is immediately flushed
from the cache. When reading large files, like big AVIs, the sequential
scan flag is probably a big boost for overall performance.
This behavior has a great performance impact on my compiler (I write
compilers). It matters when you read certain files many, many times.
In C terms, an example is windows.h. In a Windows program it is read for
almost every source file. Having this file cached will speed up
compilation, and yes, using sequential scan will likely slow things down
when compiling 100 or so files. That is a C-language analogy of what
happens in my system. I have the compiler read sources with the
sequential scan flag, since sources mean nothing to performance and this
keeps them from occupying cache space needlessly, while symbol files
(equivalent to a precompiled header) are read without the sequential
scan flag.
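In outline, the open calls look something like this (an illustrative sketch
of the policy, not the actual compiler code):

#include <windows.h>

/* Source files are read once per compilation and never reused, so the
   sequential-scan hint keeps them from crowding the cache. */
HANDLE open_source_file(const char *path)
{
    return CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                       OPEN_EXISTING, FILE_FLAG_SEQUENTIAL_SCAN, NULL);
}

/* Symbol files (the precompiled-header equivalent) are reread across many
   compilations, so they are opened without the hint and stay cached. */
HANDLE open_symbol_file(const char *path)
{
    return CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                       OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
}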
--
Norman Black
Stony Brook Software
To reply via email remove the ".dud" from my address.
Jamie Hanrahan wrote in message <35d86e76....@nntp.cts.com>...