I wrote a small encryption program and created two 64k data buffers in the data declarations rather than using VirtualAlloc. Anyway, the program runs fine and encrypts files up to 4G, but when I check it in Microsoft's performance monitor I am getting upwards of 4000 hard faults. I read online that this is a paging error, but the data produced by the program has no errors. Could this be a result of not setting the buffers to the system paging size?
A "hard fault" is not as bad as it sounds, and is completely normal for what you describe (or any program, for that matter). Paging works the opposite of how people assume: everything starts paged out, and is only paged in upon first access. You should expect a hard fault for every 4k chunk of code or data your program touches.
BTW, a 'hard fault' is when the OS actually has to load the page from disk. A soft fault, on the other hand, is when the page is not officially active but is still resident in memory.
-r
Don,
From memory, it's a page-guard mechanism the OS uses to detect page faults, but this does not mean you have any errors in your app. As long as you are reading and writing valid allocated memory, your application is fine.
I used the GetSystemInfo call to get my page size, which was 64k, so I increased the buffer size. The fault rate appears to be inversely proportional to the number of pages the buffer holds.
Thanks for your help