All right, getting back to wwwoffle: I tried installing it and, as I mentioned, it gave a segmentation fault during the install (both GUI and command line).
But nevertheless, when I looked in /etc and the other directories, the wwwoffle binaries and the wwwoffle.conf file were in the correct locations, and /etc/rc.d/init.d/wwwoffle start did start up the wwwoffled daemon, which shows up in the ps -ef output.
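For what it's worth, this is the start-and-verify sequence I used (just the init script and the ps check mentioned above):

    /etc/rc.d/init.d/wwwoffle start    # start the wwwoffled daemon via the init script
    ps -ef | grep wwwoffled            # confirm the daemon is actually running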
So I fired up my wireless connection, went to Opera, and opened the
http://localhost:8080/control/ page,
and set up Slashdot.org and news.bbc.co.uk (I don't rely on US news media for news)
as the pages to fetch, 'recursively, 1 level down'.
Then I did a wwwoffle -fetch from the command line as well as from the control page.
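As far as I can tell, the pure command-line equivalent of what I set up would be something like this; the -online step and the -r<depth> recursive flag are my reading of the man page, so the exact syntax may be off:

    wwwoffle -online                     # tell the daemon the wireless link is up
    wwwoffle -r1 http://slashdot.org/    # queue the site, recursive 1 level down
    wwwoffle -r1 http://news.bbc.co.uk/
    wwwoffle -fetch                      # fetch everything that has been queued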
My /var/cache/wwwoffle is filling up with lots of pages and so on.
Fine up to this point.
So I do a wwwoffle -offline, shut down my wifi connection, and go to
http://localhost:8080/#indexes to see the indexes.
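So the offline side of my routine looks roughly like this; the /index/ path is just how I remember the link from the welcome page, so it may be slightly off:

    wwwoffle -offline                    # switch the daemon to offline mode
    # then in Opera: http://localhost:8080/index/   (the cached-page indexes)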
I see the indexes for Slashdot and the BBC, and when I click on them I get the respective index for each site, but when I click on an actual site link it tries to connect to
http://www.slashdot.org or to
http://news.bbc.co.uk and hence asks for a network connection.
What gives? How am I supposed to read the files offline?
I am a newbie to wwwoffle, so I need some pointers on how to really set this up for reading pages offline.
Note: all of this is on my 6000L. I don't like the solution of fetching things on the desktop and then syncing (or FTPing) them to the Zaurus; I want to be as disconnected from the desktop as possible.
Thanks in advance.