I need to download certain files for my application. HttpQueryInfo() fails on two URLs with 12150 (ERROR_HTTP_HEADER_NOT_FOUND).
I can still download the file, but InternetReadFile() doesn't always download the entire file unless I have the EXACT file size. It's very strange; there must be something I'm doing wrong.
Here's the function that I use to grab the files:
Code:
std::string Internet::getURL(const char *URL)
{
    //get size of file
    std::string temp = "";
    hInternet = InternetOpen("GINA: Version 0.1", INTERNET_OPEN_TYPE_DIRECT, NULL, 0, 0);
    hFile = InternetOpenUrl(hInternet, URL, NULL, 0, 0, 0);
    DWORD sizeBuffer;
    DWORD length = sizeof(sizeBuffer);
    char *buffer = NULL;
    BOOL succeeds = HttpQueryInfo(hFile, HTTP_QUERY_CONTENT_LENGTH | HTTP_QUERY_FLAG_NUMBER,
                                  (LPVOID)&sizeBuffer, &length, NULL);
    if (!succeeds)
    {   //query fails at /maps/
        int i = GetLastError();
        char buf[25];
        MessageBox(NULL, itoa(i, buf, 10), "Error", 0);
        //MessageBox(NULL, URL, "URL Failed Query", 0);
        sizeBuffer = 65536; //read everything damn it
    }
    buffer = new char[sizeBuffer];
    //get file
    DWORD bytesRead = 0;
    DWORD bytesToRead = sizeBuffer;
    if (hFile)
    {
        do
        {
            InternetReadFile(hFile, buffer, bytesToRead, &bytesRead);
        } while (bytesRead != 0);
    }
    else
        throw "Unable to grab File";
    temp = buffer;
    InternetCloseHandle(hFile);
    InternetCloseHandle(hInternet);
    return temp;
}
The part where I set sizeBuffer to 65536 is my workaround that lets me download the files that fail with error code 12150.
The URLs that fail (all others work fine with this function) are:
Code:
http://www.imperialconflict.com/maps/
and: http://www.imperialconflict.com/rankings.php?type=topfamilies_score&g=11
They are only partially retrieved by the function above; sometimes the data that does come back is not even contiguous.
hInternet and hFile are members of my class Internet.
Thanks for any help