Forum:Content dumps

As requested, I am going to start automating content dumps for the site; see RationalWiki:Content dumps for details. Full revision histories aren't feasible, so it's just the current version of each page, which should be perfectly adequate for forks, which is what this is most useful for. The biggest content hole is the lack of images, but there we go. If there are any thoughts/suggestions/complaints let me know. tmtoulouse 21:45, 17 May 2010 (UTC)
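For reference, a current-revisions-only dump can be produced with MediaWiki's standard maintenance script. This is only a sketch of one way to do it (the actual job behind RationalWiki:Content dumps isn't documented here), and the file names are made up.

 # Current-revisions-only XML dump, compressed afterwards
 php maintenance/dumpBackup.php --current --quiet > rw-pages-current.xml
 gzip rw-pages-current.xml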

Image dumps

Is there any interest in setting up an image dump as well? I would be willing to set up a torrent-based distribution of RW images, but would need people willing to seed the torrent to save money/bandwidth. tmtoulouse 01:32, 18 May 2010 (UTC)

How much space would an image dump take up? Radioactive afikomen Please ignore all my awful pre-2014 comments. 01:42, 18 May 2010 (UTC)
Around a gigabyte at the moment, though it would grow over time. tmtoulouse 01:49, 18 May 2010 (UTC)
Only a gig or two? I would be more than willing to seed an image dump, then. Radioactive afikomen Please ignore all my awful pre-2014 comments. 02:00, 18 May 2010 (UTC)
Aye, an actual "snapshot" of RW in terms of current content and images really isn't that big. The massively expanding size requirements come from the nature of the wiki, where every version of everything is stored, even if deleted. So 50 megs of text turns into 15 gigabytes, and a gig of images turns into 5. There is really no point in doing it "just to do it", but if there is an interest in it I will set something up. tmtoulouse 02:04, 18 May 2010 (UTC)
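One possible way such an image torrent could be put together, assuming a tool like mktorrent is available; the tracker URL, directory, and file names below are hypothetical.

 # Package the image directory and build a torrent for it
 tar -czf rw-images-2010-05.tar.gz images/
 mktorrent -a udp://tracker.example.org:6969/announce \
     -o rw-images-2010-05.torrent rw-images-2010-05.tar.gz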
I'll seed from S3. Just need to add ?torrent to the end of the web URL :) CrundyTalk nerdy to me 15:08, 21 May 2010 (UTC)
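That refers to S3's BitTorrent interface of the time, which returned a .torrent file for any publicly readable object when ?torrent was appended to its URL, with Amazon acting as an extra seed. The bucket and key below are made up for illustration.

 # Fetch the torrent file for a public S3 object (bucket/key are hypothetical)
 curl -o rw-images.torrent "http://rw-dumps.s3.amazonaws.com/rw-images-2010-05.tar.gz?torrent"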

Hi. Please provide a full history dump and an image dump. You can use 7zip to compress the .xml; it works very well with text. I'm working on a repository of wiki backups, and a full dump of RationalWiki would be nice, it is a great wiki. Thanks. Emijrp (talk) 20:21, 14 September 2011 (UTC)
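For what it's worth, compressing an XML dump with 7-Zip's LZMA is a one-liner; the file names here are just examples.

 # Maximum-compression 7z archive of a history dump
 7z a -mx=9 rw-history.xml.7z rw-history.xml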

The full history dump is simply too large, even compressed; it's just not feasible. But we do have off-site backups. -- Nx / talk 06:44, 15 September 2011 (UTC)
Hm, I just downloaded the openstreetmap wiki dump from your site to see how well 7zip worked, and I was quite surprised. But that compression would probably take a lot of CPU time. -- Nx / talk 06:47, 15 September 2011 (UTC)
I have looked into generating full history dumps a few times. The main issues are that it's fairly intensive to generate and does affect wiki performance while it's occurring, and a full history dump would take a very long time to do. Really the only way that it would work is if we were working under a multi-machine infrastructure. Tmtoulouse (talk) 09:52, 15 September 2011 (UTC)
How many hours would a full history dump take? You could check statistics for the periods with the fewest visits (Sunday night?) and launch it then. Yes, 7zip is very nice; it compresses a 7.5GB file down to only 65MB. I think you can generate the dump and compress it on the fly with the --output parameter[1]. Regards. Emijrp (talk) 19:51, 16 September 2011 (UTC)
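Assuming dumpBackup.php's --output sink works as the MediaWiki manual describes, the dump can be compressed as it is written, and the job can be deprioritised so it competes less with the live wiki. The path and the nice/ionice wrapper below are only illustrative.

 # Full-history dump, compressed on the fly, run at low CPU/IO priority
 nice -n 19 ionice -c3 php maintenance/dumpBackup.php --full --quiet \
     --output=bzip2:/backups/rw-history.xml.bz2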