[[!tag wishlist]]
[My ikiwiki instance](http://www.ipol.im/) is quite heavy: 674 MB of data in the source repo, 1.1 GB in its .git folder.

It contains lots of \[[!img ]] directives (~2200) and lots of \[[!teximg ]] directives (~2700). A complete rebuild takes 10 minutes.

We could use a big machine with plenty of CPUs. Could some parallel-processing support be added to ikiwiki, by forking off the heavy external tools (imagemagick, tex, ...) and/or by processing pages in parallel?
Disclaimer: I know nothing of the Perl approach to parallel processing.
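To illustrate the kind of page-level parallelism I have in mind, here is a rough sketch using the CPAN module Parallel::ForkManager. This is not a patch: the page list and the `render_page` function are made-up placeholders, not actual ikiwiki internals.

    #!/usr/bin/perl
    # Sketch only: render independent pages in parallel worker processes.
    use strict;
    use warnings;
    use Parallel::ForkManager;

    my @pages = glob("src/*.mdwn");          # hypothetical list of pages to rebuild
    my $pm = Parallel::ForkManager->new(8);  # cap at 8 concurrent workers

    foreach my $page (@pages) {
        $pm->start and next;   # parent keeps looping; child falls through
        render_page($page);    # heavy work (imagemagick, tex, ...) runs in the child
        $pm->finish;           # child exits when its page is done
    }
    $pm->wait_all_children;

    sub render_page {
        my ($page) = @_;
        # placeholder for the real per-page rendering work
        print "rendered $page\n";
    }

The real difficulty is presumably not the forking itself but the shared state (the index, backlinks, inter-page dependencies), which is why I phrase this as a question rather than a proposal.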