> Serving Wikipedia Locally On The Z
ShiroiKuma
post Oct 23 2006, 06:17 AM
Post #1





Group: Members
Posts: 902
Joined: 22-May 04
Member No.: 3,385



With the availability of Wikipedia dumps I'm searching for the best way to serve these and browse them in a web browser locally on the Z.

There has been some recent discussion of a user running wiki2static.pl to convert the Wikipedia dumps and browsing them from a squashfs image on a Zaurus. However, the dump format has since changed; it is now XML, I think.

So I'd like to ask whether anyone is doing this, and what you are using for it. I've seen this, which seems like a good choice since I'm already running LAMP on the Z. However, you don't get images...

Any recommendations on the best way to do it on the Z? It would obviously have to be a squashfs image for the data to fit on a card. But how? Like the above post? Convert to a MySQL database on a PC first? Is anyone doing it this way, or some other way?
stbrock
post Oct 29 2006, 02:43 PM
Post #2





Group: Members
Posts: 149
Joined: 17-April 04
Member No.: 2,879



I would be interested in this too, and I believe Meanie said at one point that he was working on it. Perhaps one day soon he will surprise us with a post.

Rather than running MediaWiki, Apache, etc., there is an alternative approach that sounded appealing at first: converting the data files to compressed HTML files that could be browsed directly. After a little investigation, however, it seems there is so much variation in file formats, and so much loss of functionality involved, that it does not work very well, and the conversion programs I saw did not seem to be actively maintained.
Jon_J
post Oct 29 2006, 03:31 PM
Post #3





Group: Members
Posts: 1,843
Joined: 31-December 05
From: Illinois USA
Member No.: 8,821



I don't know if this is off topic.
I use zbedic, and after looking at Meanie's page I downloaded
"en-wikipedia_0.9.5_20050209.dic.dz"
It doesn't have any pictures, just text.
Google should find it for you. It is 412 MB and slows down the launch of zbedic.
I just put it on my C3100 hard drive in the same directory as the dictionary file I normally use with zbedic:
/hdd3/QtPalmtop/share/zbedic/
It only slows down zbedic if it's enabled in the dictionary selection screen.
I think I'll try making another copy of zbedic and renaming it zbedic2.
With two copies of zbedic, I could load one with the usual dictionary file for fast lookup, and load the second with the Wikipedia, since that one launches so slowly.

EDIT: I tried making a second copy of the zbedic binary as zbedic2, and I also put it in a different tab, but if I load the Wikipedia into one of them, it's loaded into both.
That makes the dictionary function of zbedic useless as a quick word-lookup app.
I'm sure this is the reason I tried the Wikipedia in zbedic once and then removed it.
I thought having two binaries with different names would let me run two separate instances of zbedic, one loaded with just the dictionary and the other with both the Wikipedia and the dictionary.
I have used this copy-and-rename trick once before, and it still works for qkonsole.
(I have two copies of qkonsole in two different tabs; one runs in magnified mode, the other in "normal" mode.)

This post has been edited by Jon_J: Oct 29 2006, 04:07 PM
Attached File: zbedic___wikipedia.png ( 70.66K )
 
desertrat
post Oct 29 2006, 05:37 PM
Post #4





Group: Members
Posts: 742
Joined: 15-October 05
From: Gulag, Siberia
Member No.: 8,322



QUOTE(Jon_J @ Oct 29 2006, 11:31 PM)
I thought having 2 binaries with different names would allow me to run two different instances of zbedic and have one loaded with just the dictionary, and the other loaded with both the wikipedia and dictionary.

Both copies of the binary are sharing the same configuration files. What you need is 2 sets of configuration files and some way of switching between them, rather than 2 binaries.
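A tiny launcher along the lines desertrat describes could look like the sketch below. The profile file names and the Settings path are guesses for illustration, not zbedic's real ones:

```python
import shutil
import subprocess
import sys
from pathlib import Path

def activate_profile(conf_dir: Path, profile: str) -> Path:
    """Copy the chosen profile (e.g. zbedic.dict.conf or zbedic.wiki.conf)
    over the live config file, then return the live file's path."""
    src = conf_dir / f"zbedic.{profile}.conf"
    dst = conf_dir / "zbedic.conf"
    shutil.copyfile(src, dst)
    return dst

if __name__ == "__main__" and len(sys.argv) > 1:
    # e.g. "switch-zbedic.py wiki": activate the profile, then launch.
    activate_profile(Path.home() / "Settings", sys.argv[1])
    subprocess.run(["zbedic"])
```

Run it as `switch-zbedic.py dict` for fast lookups or `switch-zbedic.py wiki` for the Wikipedia set; one binary, two configurations.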
rafm
post Oct 30 2006, 06:54 AM
Post #5





Group: Members
Posts: 145
Joined: 13-November 04
Member No.: 5,449



QUOTE(Jon_J @ Oct 30 2006, 12:31 AM)
It doesn't have any pictures, just text.
Google should find it for you. It is 412MB and slows down launching of zbedic.


The current workaround is to check the "fast load" box for zbedic, if you do not mind having less free RAM. The right solution would be implementing "lazy" loading of dictionaries, but I have never had time to do it.

Images for Wikipedia would probably take too much space. zbedic can display images starting from version 1.1, but nobody has tried it with Wikipedia so far.

To have two copies of zbedic, you would probably need to edit the "control" file in the .ipk, but I do not recommend this.
Jon_J
post Oct 30 2006, 09:24 AM
Post #6





Group: Members
Posts: 1,843
Joined: 31-December 05
From: Illinois USA
Member No.: 8,821



I decided to disable the Wikipedia in the dictionary selector.
Launching zbedic with just the English dictionary: 3½ seconds.
Launching zbedic with the Wikipedia and the English dictionary: 8½ seconds.

I don't like using fast load; I disable it on any app I install that has it enabled.
I'm keeping the Wikipedia on my HDD, so I can re-enable it if I need it.
stbrock
post Oct 30 2006, 08:37 PM
Post #7





Group: Members
Posts: 149
Joined: 17-April 04
Member No.: 2,879



The wiki2zaurus homepage hasn't been updated in a long time, and the latest downloadable version of the converted Wikipedia there is quite old. From what I read about the format changes in the Wikipedia dumps, the old Perl scripts would need a good bit of work to process the current files, but I could be wrong. Searching for wiki2zaurus on this forum turns up a couple of recent threads suggesting that someone may be working on this or another approach.
Overgauss
post Nov 9 2006, 09:25 PM
Post #8





Group: Members
Posts: 14
Joined: 17-August 05
Member No.: 7,886



Wikipedia needs to be on my Z. NEEDS.
speculatrix
post Nov 30 2006, 04:34 AM
Post #9





Group: Admin
Posts: 3,281
Joined: 29-July 04
From: Cambridge, England
Member No.: 4,149



You don't necessarily need Apache installed on the Z to run CGI scripts: check out my posting in Security/Networking for a shell-script web server that will run a CGI, and take a look in General Discussion for my squashfs-oesf-forum-archive.

Maybe you could get a minimal MediaWiki CGI running with a minimal MySQL backend.
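This is not speculatrix's shell-script server, but a comparable minimal sketch in Python of the same idea: one small handler answers /wiki/&lt;title&gt; straight from a directory of article files, with no Apache or MySQL involved. The article store and the demo page are made up for illustration:

```python
import http.server
import pathlib
import tempfile
import threading
import urllib.request

# Throwaway article store standing in for the converted dump.
ARTICLES = pathlib.Path(tempfile.mkdtemp())
(ARTICLES / "Zaurus.html").write_text("<h1>Zaurus</h1>")

class WikiHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Map /wiki/<title> onto <title>.html in the article store.
        page = ARTICLES / (self.path[len("/wiki/"):] + ".html")
        if self.path.startswith("/wiki/") and page.is_file():
            body = page.read_bytes()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the demo run quiet

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), WikiHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
html = urllib.request.urlopen(f"http://127.0.0.1:{port}/wiki/Zaurus").read()
print(html.decode())
server.shutdown()
```

A real deployment would point ARTICLES at the mounted squashfs image and add at least a title-index search; the point is only that a full web stack is not required to serve static articles.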
seiichiro0185
post Nov 30 2006, 06:32 AM
Post #10





Group: Members
Posts: 29
Joined: 23-August 05
From: Germany
Member No.: 7,931



A solution for a small Wikipedia-specific server would be wikijserver: http://wikijserver.achterliek.de/ (parts are in German).

It runs pretty well on my Z with the German "Exzellente Artikel" dump from that site. The only problem is that there is no dump of the complete Wikipedia around, and so far I have found no information on how to generate one. But IMHO this is the best solution for a complete Wikipedia on the Z, provided we get a recent dump...

seiichiro0185

PS: for everyone who wants to try it, the wikijserverl_arm_1_4_12_all.ipk doesn't work with Jeode or PersonalProfile (on Cacko), but the Ewe version with the Ewe runtime works without problems.
zi99y
post Dec 3 2006, 12:13 PM
Post #11





Group: Members
Posts: 282
Joined: 9-August 06
Member No.: 10,709



Here is a list of toolkits for converting the data: http://meta.wikimedia.org/wiki/Alternative_parsers

Notably, TomeRaider is listed, which is intended for portable devices, but I know of nothing that can read those files on the Z.

My idea is to translate the XML into PortaBase format ( http://portabase.sourceforge.net/portabase_xml.html ), although I've no idea how yet.

Does anyone use PortaBase who knows whether it would handle a large database?
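As a starting point for the XML half of that idea, a sketch like this pulls title/text pairs out of a MediaWiki export. The inline sample stands in for a real dump, and the namespace URI is one of the real export schemas, but check the xmlns on your own dump's root element, since the schema version changes over time; re-emitting the pairs in PortaBase's XML format is left to the linked spec:

```python
import xml.etree.ElementTree as ET

# Tiny inline stand-in for a real dump file.
SAMPLE = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.3/">
  <page>
    <title>Zaurus</title>
    <revision><text>The Sharp Zaurus is a PDA.</text></revision>
  </page>
</mediawiki>"""

# ElementTree spells namespaces out in Clark notation: {uri}tag.
NS = "{http://www.mediawiki.org/xml/export-0.3/}"

def extract_pages(xml_text):
    """Return (title, wikitext) pairs from a MediaWiki XML export."""
    root = ET.fromstring(xml_text)
    pages = []
    for page in root.iter(NS + "page"):
        title = page.findtext(NS + "title")
        text = page.findtext(f"{NS}revision/{NS}text")
        pages.append((title, text))
    return pages

print(extract_pages(SAMPLE))
```

For a multi-gigabyte dump you would swap `fromstring` for `ET.iterparse` and clear each `page` element after processing, so the whole tree never sits in memory at once.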
Tron
post Jun 10 2007, 06:25 AM
Post #12





Group: Members
Posts: 47
Joined: 28-February 06
Member No.: 9,244



Hi all,

Just to bring this topic back into discussion: I do think that the way wiki2static works (compress and reformat the downloaded dump, then use Apache with CGI to generate pages and run searches) is the way to go. Installing a complete wiki with MySQL to access the database is probably too much for the Zaurus, and not only because of the available storage space.

As I'm not really satisfied with the *bedic variants (neither "q" nor "z"), wouldn't it be great if someone with knowledge of Perl had a look at wiki2zaurus? Maybe that someone could get it working again...
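The storage half of that approach can be sketched in a few lines: keep each reformatted article as its own compressed file and decompress it on demand, the way a CGI script would per request, so no database is needed. The layout below is illustrative, not wiki2static's actual one:

```python
import gzip
import pathlib
import tempfile

# Throwaway store standing in for a card full of compressed articles.
STORE = pathlib.Path(tempfile.mkdtemp())

def save_article(title: str, text: str) -> None:
    """Compress one article into its own .gz file, keyed by title."""
    with gzip.open(STORE / (title + ".gz"), "wt") as f:
        f.write(text)

def load_article(title: str) -> str:
    """Decompress one article on demand, leaving the rest untouched."""
    with gzip.open(STORE / (title + ".gz"), "rt") as f:
        return f.read()

save_article("Zaurus", "The Sharp Zaurus is a PDA.")
print(load_article("Zaurus"))
```

Per-article compression trades some ratio against a single big archive, but it means a lookup touches exactly one small file, which suits a slow CF or SD card.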
