Support and General Use > Plugins/Viewers

wikipedia xml dump to dir script


man:
Here is a PHP script I wrote for the command line:
http://x.hopto.org/wikidump.php
Problem: on a 5 GB FAT32 filesystem the script cannot create any more files: "No space left on device".
But only about 600 MB are used... (122029 files, 3585 dirs)

Still, moving just a small part of the wiki to my iPod and using it with Rockbox was great.

Maybe someone finds this helpful.

man

dionoea:
You could split the dirs into subdirectories based on the first 2 or 3 letters of the Wikipedia topic name. That would make it possible to keep all the pages.
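The original script is PHP and only linked above, so here is a hypothetical sketch of that bucketing idea in Python; the directory depth, the `.txt` extension, and the fallback character for non-alphanumeric titles are all arbitrary choices:

```python
import os

def bucket_path(root, title, depth=2):
    """Return a file path for an article, nested under subdirectories
    named after the first `depth` letters of its title,
    e.g. 'Foobar' -> root/f/o/Foobar.txt."""
    # Lowercase the leading letters; use '_' for anything non-alphanumeric
    # (and pad short titles) so every article lands in some bucket.
    letters = [c.lower() if c.isalnum() else "_" for c in title[:depth]]
    while len(letters) < depth:
        letters.append("_")
    directory = os.path.join(root, *letters)
    os.makedirs(directory, exist_ok=True)
    return os.path.join(directory, title + ".txt")
```

Bucketing on two letters spreads the ~122k files over up to 27×27 directories, which keeps any single directory well below typical per-directory limits.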

man:
Thats what the script already does... But it doesnt work either. Is this a limit to the FAT32 filesystem, or is the partition too small?

Chronon:
It should be easy to tell whether you simply ran out of space on that partition. Individual files on FAT32 can be up to 4 GB, so I don't think file size is the issue.
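A quick way to rule out actual disk exhaustion is to check the partition's usage directly (the mount point below is an assumption; substitute wherever the iPod is mounted):

```shell
# Show used and available space on the iPod's partition
# (/mnt/ipod is a placeholder mount point)
df -h /mnt/ipod
```

If `df` still reports gigabytes free while new files fail with "No space left on device", the cause is a filesystem limit rather than capacity.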

cool_walking_:
There are also limits on:

the number of files in the partition.
the number of files in a folder.
the number of files/folders in the root directory.

122029 and 3585 don't seem like very "even" numbers for hitting a limit, but maybe you've hit the "number of files in the root directory" limit, which I think is one of the smaller ones (somewhere in the range 128-512, I think).
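One reason the numbers can look "uneven" is that with VFAT long filenames, each file consumes several 32-byte directory entries, not one. Assuming the commonly cited layout (one 8.3 entry per file, plus one long-name entry per 13 UTF-16 characters, and at most 65,536 entries per FAT32 directory), the effective per-directory file limit can be estimated:

```python
import math

# Commonly cited FAT32 figure: a directory holds at most 65,536
# 32-byte entries (an assumption here, not measured on this device).
FAT32_MAX_DIR_ENTRIES = 65_536

def dir_entries_for(filename):
    """Directory entries one file consumes under VFAT long filenames:
    one short 8.3 entry plus one LFN entry per 13 UTF-16 characters."""
    lfn_entries = math.ceil(len(filename) / 13)
    return lfn_entries + 1

def max_files_per_dir(avg_name_len):
    """Rough upper bound on files per directory for a given name length."""
    return FAT32_MAX_DIR_ENTRIES // dir_entries_for("x" * avg_name_len)
```

For example, with 30-character article titles each file consumes 4 entries, capping a directory at roughly 16,000 files, far below the nominal entry count.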
