Newbie: Memory leak problem

I've been using libwww for some simple things. Now I need to monitor a
web page continuously. I tried to write something based on the chunk.c
sample code, but there seems to be a memory leak somewhere. Is there a
routine I need to call in order to free up the memory allocated by
HTChunk_data or HTLoadToChunk, other than HTChunk_delete?
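
For reference, my current understanding of the ownership rules (please
correct me if I've read HTChunk.h wrong) is that HTChunk_data just
returns a pointer into the chunk's own buffer, so HTChunk_delete should
free both the chunk object and its data, and the only time I would have
to free the data myself is if I detach it with HTChunk_toCString.
Roughly like this (just a sketch of what I think the two patterns are,
not code I'm sure about):

#include <stdio.h>
#include <WWWLib.h>

/* Pattern 1: the chunk keeps ownership of the data.
   HTChunk_data returns a pointer into the chunk's buffer, so
   HTChunk_delete should release both the chunk and the data. */
void use_and_delete(HTChunk * chunk) {
  char * data = HTChunk_data(chunk);
  if (data) printf("got %d bytes\n", HTChunk_size(chunk));
  HTChunk_delete(chunk);                   /* frees chunk and data together */
}

/* Pattern 2: detach the data and free it myself. */
void take_and_free(HTChunk * chunk) {
  char * data = HTChunk_toCString(chunk);  /* deletes the chunk, keeps the data */
  if (data) {
    printf("%s\n", data);
    HT_FREE(data);                         /* now freeing it is my job */
  }
}

In my program I only use the first pattern, so I would have expected
HTChunk_delete to be enough.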

If I run the program below I can see that its memory segment keeps
growing:


#include <stdio.h>
#include <WWWLib.h>
#include <WWWApp.h>

int main() {
  HTChunk * chunk;
  HTRequest * request;
  char * url = "http://www.dolphinics.no";
  char * cwd;
  char * absolute_url;
  char * string;
  int i;

  printf("hello libwww world\n");

  request = HTRequest_new();
  HTProfile_newPreemptiveClient("hello", "1.0");

  /* We want raw output including headers */
  HTRequest_setOutputFormat(request, WWW_RAW);

  /* Close connection immediately */
  HTRequest_addConnection(request, "close", "");

  cwd = HTGetCurrentDirectoryURL();
  absolute_url = HTParse(url, cwd, PARSE_ALL);

  /* Fetch the page over and over; note that this loop never exits,
     so the cleanup code after it is never reached */
  for (i = 0; 1; i++) {
    printf("iteration number %d\n", i);

    if (absolute_url) {
      chunk = HTLoadToChunk(absolute_url, request);
      if (chunk) {
        /* HTChunk_data just returns a pointer into the chunk's buffer... */
        string = HTChunk_data(chunk);
        /* ...so this should, as far as I can tell, free the data as well */
        HTChunk_delete(chunk);
      }
    }

    /* Reset the request so it can be reused in the next iteration */
    HTRequest_clear(request);
  }

  HT_FREE(absolute_url);
  HT_FREE(cwd);

  HTRequest_delete(request);
  HTProfile_delete();

  return 0;
}


Petter
