
Newbie: Memory leak problem

From: Petter Gustad <pegu@dolphinics.no>
Date: Fri, 21 Jan 2000 17:23:06 +0100
Message-Id: <200001211623.RAA22436@scintight.dolphinics.no>
To: www-lib@w3.org

I've been using libwww for some simple things. Now I need to monitor a
web page continuously. I tried to write something based on the chunk.c
code, but there seems to be a memory leak somewhere. Is there a routine
other than HTChunk_delete that I need to call in order to free up the
memory allocated by HTChunk_data or HTLoadToChunk?
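
For reference, here is the chunk lifecycle I'm assuming is correct: load
into a chunk, copy the data out if it has to outlive the chunk, then
delete the chunk. The malloc/memcpy copy is just my own illustration (it
needs <stdlib.h> and <string.h>), not something I found in the
documentation:

  HTChunk * chunk = HTLoadToChunk(absolute_url, request);
  if (chunk) {
      int    len  = HTChunk_size(chunk);   /* number of bytes in the chunk */
      char * data = HTChunk_data(chunk);   /* pointer into the chunk's own buffer */
      char * copy = NULL;
      if (data && len > 0) {
          copy = malloc(len + 1);          /* copy out before the buffer is freed */
          if (copy) {
              memcpy(copy, data, len);
              copy[len] = '\0';
          }
      }
      HTChunk_delete(chunk);               /* releases the chunk and its buffer */
      /* ... use copy ... */
      free(copy);
  }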

If I run the program below, I can see that its memory segment keeps
growing:


#include <stdio.h>
#include <WWWLib.h>
#include <WWWApp.h>

int main() {
  HTChunk * chunk;
  HTRequest * request;
  char * url = "http://www.dolphinics.no";
  char * cwd;
  char * absolute_url;
  char * string;
  int i;

  printf("hello libwww world\n");

  request = HTRequest_new();
  HTProfile_newPreemptiveClient("hello", "1.0");

  /* We want raw output including headers */
  HTRequest_setOutputFormat(request, WWW_RAW);

  /* Close connection immediately */
  HTRequest_addConnection(request, "close", "");

  cwd = HTGetCurrentDirectoryURL();
  absolute_url = HTParse(url, cwd, PARSE_ALL);

  for (i = 0; 1; i++) {
    printf("iteration number %d\n", i);

    if (absolute_url) {
      chunk = HTLoadToChunk(absolute_url, request);
      if (chunk) {
        string = HTChunk_data(chunk);   /* pointer into the chunk's buffer */
        HTChunk_delete(chunk);          /* the only cleanup I do per iteration */
      }
    }

    HTRequest_clear(request);
  }

  /* never reached: the loop above runs forever */
  HT_FREE(absolute_url);
  HT_FREE(cwd);

  HTRequest_delete(request);
  HTProfile_delete();

  return 0;
}
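
I haven't ruled out that the request object itself is what grows. A
variant of the loop that deletes and re-creates the request on every
iteration would look roughly like this; it only reshuffles the same
calls used above (HTRequest_new, HTRequest_delete, HTLoadToChunk,
HTChunk_delete), and I don't know whether this is the intended way to
use the API:

  for (i = 0; 1; i++) {
    HTRequest * req = HTRequest_new();
    HTRequest_setOutputFormat(req, WWW_RAW);
    HTRequest_addConnection(req, "close", "");

    chunk = HTLoadToChunk(absolute_url, req);
    if (chunk)
      HTChunk_delete(chunk);

    HTRequest_delete(req);   /* drop the whole request instead of just clearing it */
  }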


Petter
Received on Friday, 21 January 2000 10:22:33 GMT
