I wrote a script that will download all the latest revisions of a mediawiki site. In short, it does a good part of the stuff required for the migration: it downloads the goods (i.e. the latest version of every page, automatically) and commits the resulting structure. There are still a good few pieces missing for an actual complete conversion to ikiwiki, but it's a pretty good start. It only talks to mediawiki through HTTP, so no special access is necessary. The downside of that is that it will not attempt to download every revision, for performance reasons. The code is here: http://anarcat.ath.cx/software/mediawikigitdump.git/ See the header of the file for more details and todos. -- [[users/Anarcat]]
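To give a rough idea of what such a script involves, here is a minimal
sketch of the same approach in Python (not the mediawikigitdump code
itself; `API_URL` and `OUT_DIR` are placeholders to adapt to your own
wiki): it lists every page through the standard MediaWiki HTTP API,
fetches the latest wikitext of each one, and commits the result with git.

    #!/usr/bin/env python3
    # Sketch only: dump the latest revision of every MediaWiki page over
    # plain HTTP and commit the result to git. Values below are placeholders.
    import json
    import os
    import subprocess
    import urllib.parse
    import urllib.request

    API_URL = "http://example.org/w/api.php"  # placeholder wiki API endpoint
    OUT_DIR = "wiki-dump"                     # placeholder git work tree

    def api(params):
        """Call the MediaWiki API and return the decoded JSON reply."""
        url = API_URL + "?" + urllib.parse.urlencode(dict(params, format="json"))
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    def all_pages():
        """Yield every page title, following the API's continuation markers."""
        params = {"action": "query", "list": "allpages", "aplimit": "500"}
        while True:
            data = api(params)
            for page in data["query"]["allpages"]:
                yield page["title"]
            if "continue" not in data:
                break
            params.update(data["continue"])

    def latest_text(title):
        """Return the wikitext of the latest revision of one page."""
        data = api({"action": "query", "prop": "revisions",
                    "rvprop": "content", "titles": title})
        page = next(iter(data["query"]["pages"].values()))
        return page["revisions"][0]["*"]

    os.makedirs(OUT_DIR, exist_ok=True)
    subprocess.run(["git", "init", OUT_DIR], check=True)
    for title in all_pages():
        # Flatten titles into file names; converting wikitext to ikiwiki
        # markup is a separate step this sketch does not attempt.
        path = os.path.join(OUT_DIR, title.replace("/", "_") + ".wiki")
        with open(path, "w", encoding="utf-8") as f:
            f.write(latest_text(title))
    subprocess.run(["git", "-C", OUT_DIR, "add", "."], check=True)
    subprocess.run(["git", "-C", OUT_DIR, "commit",
                    "-m", "import latest mediawiki revisions"], check=True)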
----
The u32 page is excellent, but I wonder if documenting the procedure here
would be worthwhile. Who knows, the remote site might disappear. But also
there are some variations on the approach that might be useful: