#WIKIPEDIA DATABASE DOWNLOAD#
To download a subset of the database in XML format, such as a specific category or a list of articles, see Special:Export; its usage is described at Help:Export.

For full dumps, go to Latest Dumps and look for the files that have 'pages-meta-history' in their name. These contain all revisions of all pages and expand to multiple terabytes of text, so only download them if you are sure you can cope with that quantity of data. SQL files for the pages and links are also available. The most commonly used files are:

- all-titles-in-ns0.gz – article titles only (with redirects).
- pages-meta-current.xml.bz2 – current revisions only, all pages (including talk).
- pages-articles.xml.bz2 – current revisions only, no talk or user pages. This is probably what you want; it is approximately 18 GB compressed and expands to over 78 GB when decompressed.

Where possible, download the data dump using a BitTorrent client: torrenting has many benefits, reduces server load, and saves bandwidth costs.

Where do I get it?

- English Wikipedia dumps in SQL and XML: dumps.
- Dumps from any Wikimedia Foundation project: dumps.

Some of the many ways to read the English-language Wikipedia while offline:

- Mobile applications – see "list of Wikipedia mobile applications".
- Wikipedia on Rockbox: § Wikiviewer for Rockbox.
- Selected Wikipedia articles as a printed book: Help:Books/Printed books.
- Selected Wikipedia articles as a PDF, OpenDocument, etc.: Wikipedia:Books.
- BzReader and MzReader (for Windows): § BzReader and MzReader (for Windows).
- Dynamic HTML generation from a local XML database dump.
- Static HTML tree dumps for mirroring or CD distribution.
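The pages-articles dump mentioned above is far too large to load into memory at once, so it has to be decompressed and parsed as a stream. Below is a minimal sketch of that pattern using only Python's standard library (`bz2` plus `xml.etree.ElementTree.iterparse`). The inline `SAMPLE_XML`, its export-schema namespace version, and the element layout are simplified stand-ins for illustration; verify them against the header of the actual dump file you download.

```python
import bz2
import io
import xml.etree.ElementTree as ET

# Tiny stand-in for a real pages-articles.xml.bz2 dump (assumed structure;
# real dumps carry much more metadata per <page> and <revision>).
SAMPLE_XML = b"""<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page><title>Alpha</title><revision><text>First article.</text></revision></page>
  <page><title>Beta</title><revision><text>Second article.</text></revision></page>
</mediawiki>"""

# Namespace prefix as used by the export schema (version is an assumption;
# check the xmlns attribute at the top of your dump).
NS = "{http://www.mediawiki.org/xml/export-0.10/}"

def iter_pages(fileobj):
    """Yield (title, wikitext) pairs from a bz2-compressed XML dump stream.

    iterparse processes the file incrementally, and elem.clear() releases
    each <page> after use, so memory stays flat even on a multi-GB dump.
    """
    with bz2.open(fileobj, "rb") as xml_stream:
        for _event, elem in ET.iterparse(xml_stream, events=("end",)):
            if elem.tag == NS + "page":
                title = elem.findtext(NS + "title")
                text = elem.findtext(f"{NS}revision/{NS}text")
                yield title, text
                elem.clear()  # free the subtree we just consumed

# Demo on the in-memory sample; with a real dump you would pass a filename
# such as "enwiki-latest-pages-articles.xml.bz2" to bz2.open instead.
compressed = io.BytesIO(bz2.compress(SAMPLE_XML))
titles = [title for title, _ in iter_pages(compressed)]
print(titles)  # expect ['Alpha', 'Beta']
```

The same loop works unchanged on the real 18 GB file, since neither `bz2.open` nor `iterparse` ever holds more than one page subtree in memory at a time.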