- From: Gordon Haupt <GHaupt@getmedia.com>
- Date: Tue, 14 Mar 2000 13:59:36 -0800
- To: www-lib@w3.org
I am having a problem with what appears to be a memory leak in the HTProfile
functions. Using the Linux top command, I can see the size of the program grow
with each call of the member function listed below. If I comment out the
HTProfile_newNoCacheClient() and HTProfile_delete() calls, the leak goes away.
Conversely, if I comment out everything except those two calls, the memory
growth persists (a stripped-down sketch of that test is at the end of this
message). It appears that HTProfile_delete() is not completely cleaning up the
memory allocated by HTProfile_newNoCacheClient(), unless I'm missing something.
Any help would be greatly appreciated.
Thanks!
Gordon Haupt
String wwwASPInet::getData(String query)
{
    if (_source_url.Length() == 0)
    {
        _error = "Error: URL not set.\n";
        _errid = -3002;
        return "";
    } // end if - _source_url

    HTRequest * request = HTRequest_new();
    HTChunk * chunk = NULL;

    // concatenate the URL and the query
    String url = _source_url + "?" + query;

    //HTProfile_newClient("wwwnet", "1.0"); // tried this one too
    HTProfile_newNoCacheClient("wwwnet", "1.0");

    HTRequest_setOutputFormat(request, WWW_SOURCE);
    HTRequest_setPreemptive(request, YES);
    HTAlert_setInteractive(NO);

    char * cwd = HTGetCurrentDirectoryURL();
    char * absolute_url = HTParse(url.itsData, cwd, PARSE_ALL);
    HTAnchor * anchor = HTAnchor_findAddress(absolute_url);

    // this query gets the data
    chunk = HTLoadAnchorToChunk(anchor, request);
    if (HTChunk_data(chunk) == NULL)
    {
        _error = "Error: Could not get CONTENT_LENGTH information";
        _errid = -3003;
        cout << "here" << endl;
        HT_FREE(absolute_url);
        HT_FREE(cwd);
        HTRequest_delete(request);
        HTProfile_delete();
        return "";
    } // end if - QueryInfo

    String message;
    message = HTChunk_toCString(chunk);

    HT_FREE(absolute_url);
    HT_FREE(cwd);
    HTRequest_delete(request);
    HTProfile_delete();
    return message;
} // end - getData
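For reference, the test with everything else commented out reduces to roughly
the sketch below: nothing but the two profile calls in a loop (the includes
follow the libwww example programs, and the loop count and client name/version
strings are just placeholders). This shows the same growth under top for me.

#include "WWWLib.h"
#include "WWWInit.h"

int main()
{
    // Only the profile setup/teardown, repeated. Each
    // HTProfile_newNoCacheClient()/HTProfile_delete() pair ought to release
    // whatever it allocates, but the process size keeps creeping up, which
    // matches what I see when getData() is called repeatedly.
    for (int i = 0; i < 1000; i++)
    {
        HTProfile_newNoCacheClient("wwwnet", "1.0");
        HTProfile_delete();
    }
    return 0;
}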
Received on Tuesday, 14 March 2000 17:00:27 UTC