I'm using URLDownloadToFile to download files from an FTP site.
The files on the site may be updated at any time, from seldom to several times a day.
Usually the function will load the file from cache if there hasn't been any change, or download from the site if the file has changed. This is the way I expected it to work.
Occasionally, the function returns the file in the cache rather than the new one on the FTP site.
I don't want to blindly execute a DeleteUrlCacheEntry call, because the files are quite large, and it's a big waste of time if there hasn't been any change.
Does anyone have any recommendations on how to solve this problem?
The only thought I have, Jim, is to examine the index file.
Or perhaps send a request and examine the header that is returned prior to download.
http://vbnet.mvps.org/index.html?code/internet/urldownloadtofile.htm has a possible solution, but I think DeleteUrlCacheEntry is the only way. I've been making a little AutoUpdate library based around the URLDownloadToFile function and have had to use DeleteUrlCacheEntry.
Thanks guys.
Does anyone know the criteria used by the system to determine when to use a cached file and when to download a new version?
To know when to update or not (three ways I know of):
Use version number comparison, compare the time modified, or use a hash function.
Use a hash function, and force a download of a file index that contains file hashes for the files currently on the server. Compare to the local list, and if they're different, re-download.
Best regards,
Astro.
he is asking about the cache - not the files
Yes.
If he uses a hash and stores them in a file, he can figure out if the files on the server are newer and thus whether they need to be downloaded or not.
Hash file:
Server File 1 - Hash - 0x53DA
Local File 1 - Hash - 0x6ACC
Hashes differ, so download the file.
You could even just store a version number or something.
Best regards,
Astro.
The thing is: you may not have any access to the server beyond reading, so it may not be possible to publish a hash index there.
FtpGetFileSize: __in HINTERNET hFile, __out LPDWORD lpdwFileSizeHigh
GetFileSize: __in HANDLE hFile, __out_opt LPDWORD lpdwFileSizeHigh
Both return the low-order DWORD of the size and write the high-order DWORD through lpdwFileSizeHigh. Use the first for the remote file and the second for the local file, then compare the sizes. If they do not match, simply download the file from the server to fix the mismatch.
Well, the response header has the size in it, and maybe even the date and a hash (for HTTP that would be Content-Length, Last-Modified, and ETag, at least).
I don't remember all the stuff in a header, and I am too damn lazy to look it up - lol
Hi, I have a problem with FtpGetFileSize.
I successfully connect to my NET,
and when I want to get the size of index.html on my NET with the following code:
line1..... invoke FtpOpenFile,hConnect,SADD("public_html/index.html"),GENERIC_READ,FTP_TRANSFER_TYPE_BINARY,NULL
line2..... invoke FtpGetFileSize,eax,NULL
line3..... invoke MessageBox,hWin,str$(eax),SADD("get file size in bytes on site Successfully"),0
it returns the correct byte size in eax,
but after line3 runs, my program crashes (NOT RESPONDING) :dazzled:
I think it returns > 50000 in eax, and that makes my program crash,
so, I try with the following code:
LOCAL hFile:HANDLE
LOCAL dwFileSize:DWORD
.
.
.
.
line1..... invoke FtpOpenFile,hConnect,SADD("public_html/index.html"),GENERIC_READ,FTP_TRANSFER_TYPE_BINARY,NULL
line2..... mov hFile,eax
line3..... invoke FtpGetFileSize,hFile,addr dwFileSize
line4..... invoke MessageBox,hWin,str$(dwFileSize),SADD("get file size in bytes on site Successfully"),0
but it still crashes after showing the MessageBox.
What's wrong? Could someone help me with this part?