sh0dan // VoxPod

Wednesday, November 17, 2010

Updated Wikipedia mDict conversions

As you may have noticed, I have not spent any time on the Wikipedia mDict conversions, since I'm very busy with Rawstudio. A very friendly user called "Or" has converted the latest Wikipedia dumps. Here is his message:

---

Thanks for sharing the source code. An update, based on your work, for the October 2010 version is available at: http://ahuv.net/wikipedia


Links to torrents:

English: 3,483,000 items
Spanish: 1,361,000 items
French: 1,239,000 items
German: 1,033,000 items
Russian: 720,000 items
Portuguese: 681,000 items
Hebrew: 111,000 items
Arabic: 183,000 items
Persian: 109,000 items

8 Comments:

  • This comment has been removed by the author.

    By Blogger n3yron, at 9:54 pm  

Hi, I downloaded the Russian Wikipedia, and the problem is that search does not work in Russian (Cyrillic), although English searches work fine. Do you know how to solve this?

    By Blogger n3yron, at 9:55 pm  

  • I haven't made them.

But I think I remember that I had to convert the text file somehow for search to work. Can't really remember anymore.

    By Blogger Klaus Post, at 8:52 am  

  • Many thanks, Sh0dan, for your Wikipedia conversions and links.

I bought a bigger memory card to accommodate the 2.7 GB one, and now there's a 3.7 GB one - I need an even bigger card!

    Dave

    By Anonymous Anonymous, at 4:56 pm  

  • @Klaus & @Ahuv
    Finally, great job...

I want to test the Hebrew one.
In my experience with the program, UTF-8 characters usually don't show up in the text output.

Or maybe it was just me.

    Andre

    By Anonymous Anonymous, at 12:24 pm  

  • @Klaus & @Ahuv
Hi, thank you so much for the Wikipedia dump, which is an outstanding piece of software. I use the fr-wiki-max.mdx release of 12nov2010, but there is an issue with the display of numbers in many articles. If you search for 'Paris', for instance, the population number does not appear, whereas it does with the 'en-wiki-medium-oct10.mdx' release of 16nov2010. Any idea where that issue could come from? Both the 'fr' and 'en' Wikipedia websites display all the numbers normally. Otherwise, the Wikipedia dump is an excellent tool.

    By Blogger Stephane, at 4:15 pm  

Hello sh0dan,
Is there any tutorial for making Wikipedia dumps for mDict? I searched all over the net but didn't find anything. If you could help me with this, that would be the greatest help. Thank you,
and sorry for my English.

    By Anonymous amir, at 12:43 am  

  • Hello,
I see some links for "Turkish Wikipedia" at http://ahuv.net/wikipedia/. Somehow, I cannot download it. Can you help?

    By Anonymous Anonymous, at 10:17 am  
