OESF Portables Forum
General Forums => General Discussion => Topic started by: daniel3000 on May 08, 2005, 04:06:31 pm
-
Hello,
Is it possible to somehow read the OESF forums offline, directly on the Zaurus?
For example, has someone mirrored the forums to an NNTP server, or is it possible to subscribe to an email distribution list for the forums?
daniel
-
I've been looking expectantly at oloqt (http://www.hadeco.co.jp/r&d/sl/oloqt/oloqt.htm) for some time, but it's all in Japanese, so it would take some work to understand the docs and translate the UI. From what I could gather, this program lets you gather data from mail servers, RSS feeds, or plain web sites and browse them later when you're offline. It seems very powerful, with some advanced scripting features, but of course this is mostly guesswork, because the docs are all in Japanese. All I did was capture all the docs I could get (and run them through the Excite online translator) to have a look later, which hasn't happened yet...
Maybe this program could be translated and an appropriate script written to capture this board for offline reading? For the time being, I use Plucker (or more exactly Sunrise on my Windoze laptop) to make a snapshot of some of the board's forums, and opie-reader to read the files on the Zaurus. The problem is that the formatting is lost in the process, so reading big threads can become tedious, but it's better than nothing, I guess...
-
Did anybody ever come up with a way to save & read the forums offline? I was playing around with sitescooper (http://www.sitescooper.org/), but I haven't written any of my own site files yet, so it'd be an uphill battle, and pointless if someone has a better method.
thks
-
I haven't tried to do it myself, so I don't know how difficult it would be, but this seems like the kind of task that Plucker (http://www.plkr.org/) was designed for.
Opie Reader will read Plucker format, so once you've got an archive built, just transfer it to your Zaurus and open it with OpieReader.
Good Luck.
David
-
JustReader reads Plucker format too.
-
I believe that sitescooper and Plucker are basically the same type of tool. Sitescooper can actually convert its 'scoops' to Plucker format with Plucker's command-line utility. Correct me if I'm wrong, but as far as I can tell, sitescooper has *a lot* more options than Plucker, whereas Plucker has a nice, easy-to-use GUI frontend going for it.
I actually doubt I'll have time to develop anything before I leave on vacation (the reason for this idea in the first place; all I'll have is dialup), but I'll post if I do. thks
-
You could also take a look at Sunrise, which is the successor to JPluckX.
http://laurens.typepad.com/sunrise/ (http://laurens.typepad.com/sunrise/)
Mike.
-
You could also try httrack (using the webhttrack GUI) to mirror a site on your Linux or Windows desktop box, and then either:
- Serve it to your internet-connected Z using Apache or another web server program
- Convert it to Plucker format and read it on the Z
- Copy whichever portions of the site you like to the Z and read them with a standard web browser
or
- Run httrack/webhttrack on the Zaurus itself and then browse the mirrored site offline
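The first option above (serving the mirror over your LAN) doesn't even need Apache. Here's a minimal sketch using Python's built-in HTTP server; the mirror directory name is hypothetical, so point it at wherever httrack wrote its output:

```python
# Sketch: serve an httrack mirror over HTTP so another device on the
# LAN (e.g. the Zaurus) can browse it with any web browser.
import functools
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve_mirror(directory, port=8000):
    """Start a background HTTP server rooted at `directory` and return it."""
    handler = functools.partial(SimpleHTTPRequestHandler, directory=directory)
    server = HTTPServer(("0.0.0.0", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    # "oesf-mirror" is a made-up path for the httrack output directory.
    server = serve_mirror("oesf-mirror")
    print("Browse http://<this-host>:8000/ from the Zaurus")
```

Passing `port=0` lets the OS pick a free port (readable afterwards from `server.server_address`), which is handy if 8000 is taken.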
See, httrack creates an exact mirror of the web site's structure on your hard drive. It also supports updating a site with minimal redownloading.
-
Heh... let's see how long it takes and how much space it will take to mirror these forums (with nearly 100k posts).
I'm working on upgrading the forums, which will provide a nice RSS feature for people to enjoy. Hopefully I'll get it all upgraded by next weekend.
-
You could try setting a depth limit of 2 from the main forum page, thus getting only the first page of each topic... You wouldn't need to get archived posts. Or you could set filters. You don't have to grab everything.
But you could if you wanted to .
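To see why a depth limit of 2 prunes so much, here's a toy breadth-first walk over a made-up miniature of a forum's link graph (the page names are invented; real tools like httrack apply the same idea to actual URLs):

```python
# Toy illustration of httrack-style depth limiting: a breadth-first
# walk over a link graph that stops following links `depth` hops
# away from the start page.
from collections import deque

def crawl(links, start, depth):
    """Return the set of pages reachable within `depth` hops of `start`."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        page, d = queue.popleft()
        if d == depth:
            continue  # depth limit reached: don't follow this page's links
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append((target, d + 1))
    return seen

# Invented link structure: index -> forums -> topics -> archived pages.
links = {
    "index": ["forum1", "forum2"],
    "forum1": ["topic1"],
    "topic1": ["topic1-page2"],  # archived continuation page
}
print(sorted(crawl(links, "index", 2)))
# → ['forum1', 'forum2', 'index', 'topic1']
```

Depth 2 from the index grabs each topic's first page only; depth 3 would also pull in `topic1-page2` and everything else the archive links to, which is where the download size explodes.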
-
he he. Just for kicks, I tried. hooowee. I don't think I want to store 270 MB of OESF on my Z just to be able to read it offline, though.
HTTrack Website Copier/3.33 mirror complete in 5 hours 42 minutes 53 seconds : 34118 links scanned, 34380 files written (276559076 bytes overall) [80901484 bytes received at 3932 bytes/sec], 272524021 bytes transfered using HTTP compression in 33771 files, ratio 21%, 3.8 requests per connection
(677 errors, 2619 warnings, 0 messages)
I look forward to trying the rss feed out. thks
-
I'm trying a straight depth-2 scoop right now to see how it works. Maybe I'll finally force myself to learn something about sitescooper scripting.
-
If you only want to read the new posts, you could try depth one from the "New posts" page of your account. Make sure to give httrack the cookie used to keep your account information, or your username/password so it can log in.
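Carrying the cookie can be sketched like this. Both the URL and the cookie name/value below are assumptions; copy the real ones from your browser after logging in to the forum:

```python
# Sketch: build a request for the "New posts" page that carries a
# logged-in session cookie, the way a crawler would need to.
import urllib.request

def new_posts_request(cookie):
    """Return a Request for the new-posts page with the session cookie attached."""
    # Hypothetical URL -- substitute the real "New posts" link from the forum.
    url = "https://www.oesf.org/forum/index.php?act=Search&CODE=getnew"
    req = urllib.request.Request(url)
    req.add_header("Cookie", cookie)
    return req

# "session_id=abc123" is a placeholder; copy the real cookie from your browser.
req = new_posts_request("session_id=abc123")
print(req.get_header("Cookie"))  # → session_id=abc123
```

Sending that request through `urllib.request.urlopen` (or pointing httrack at the same URL with its cookie option) then fetches the page as your logged-in user instead of as a guest.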