Subject: Beginner problems with libwww and posix threads.
From: Keith Sibson <email@example.com>
Date: Sun, 13 Jul 1997 19:10:03 +0100
Organization: Glasgow University
X-Authentication-Warning: www10.w3.org: Host vanuata.dcs.gla.ac.uk [18.104.22.168] claimed to be vanuata
X-Mailer: Mozilla 3.01Gold (X11; I; Linux 2.0.27 i586)
I'm having problems using real POSIX threads with libwww. The
learning curve for the library is very steep, so I didn't want to
get into libwww's pseudo-threads and the like, since I only want to do
something extremely simple.
Basically I want to set up the library in the standard way, then retrieve
the text of two or more URLs into chunks in parallel. I did this by running
the standard client setup, then spawning two POSIX threads, each of
which creates a new request and calls HTLoadToChunk (or the like). The main
program then busy-waits for whichever completes first and displays it.
According to my interpretation of what documentation there is on
(real) threading, this should be thread-safe. Have I completely
misunderstood? What happens is that the program randomly segfaults,
or fails on one or both URLs. My code itself is not at fault, though
of course I am probably abusing the library. Can anyone help out? What's
the quickest and easiest way to do this?
I'm using Red Hat Linux and libwww 5.1b, compiled with reentrancy enabled.
Thanks in advance,
-= http://www.dcs.gla.ac.uk/~sibsonk =-