If the user interrupts the page load while `svn commit` is running, the
repository will be left in an inconsistent state. The probability of this
happening increases with the size of the repository and the number of plugins
installed, because both affect how long the post-commit hook takes to run.
(The core issue, I guess, is that we're abusing the concept of a "working
copy" by giving everybody the same one.) Here are the main solutions that I
can see: (1) the CGI queues commits so that a single process can act upon them
sequentially, or (2) optionally divorce the `ikiwiki --refresh` from the
`svn commit` so that commits happen faster. -- [[Ben]]
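
To make option (1) a little more concrete, here is a minimal sketch of
serialising commits through an exclusive lock, so that only one process at a
time acts on the shared working copy. The lock file path, the subroutine and
the commit message are invented for illustration and are not part of ikiwiki:

    #!/usr/bin/perl
    # Illustrative sketch only, not ikiwiki code: serialise web commits by
    # taking an exclusive lock before touching the shared working copy, so a
    # single process acts on queued commits one after another.
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $lockfile = "/var/lib/ikiwiki/commit.lock";   # assumed location

    sub queued_commit {
        my ($wc, $message) = @_;

        open(my $lock, '>', $lockfile) or die "cannot open $lockfile: $!";
        flock($lock, LOCK_EX) or die "cannot lock $lockfile: $!";

        # Only one committer gets past the flock at a time, so the shared
        # working copy is never committed from two processes at once.
        system("svn", "commit", "-m", $message, $wc) == 0
            or die "svn commit failed: $?";

        close $lock;   # releases the lock for the next queued commit
    }

    queued_commit("/srv/ikiwiki/wc", "web commit");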

I'm not aware of web servers, at least apache, killing CGI processes when the
user stops a page load. If this is happening, ikiwiki should be able to avoid
it by blocking whatever signal is causing it to terminate. --[[Joey]]
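
For what it's worth, a minimal sketch of that suggestion, assuming the signals
involved are something like SIGPIPE or SIGTERM delivered when the client goes
away (which signals actually matter here is a guess, not confirmed):

    # Sketch of the signal-blocking idea: block plausible culprits (SIGPIPE,
    # SIGTERM, SIGHUP; an assumption) for the duration of the commit, then
    # restore the previous signal mask.
    use strict;
    use warnings;
    use POSIX qw(SIGTERM SIGPIPE SIGHUP SIG_BLOCK SIG_SETMASK sigprocmask);

    sub commit_uninterruptible {
        my @cmd = @_;

        my $block = POSIX::SigSet->new(SIGTERM, SIGPIPE, SIGHUP);
        my $old   = POSIX::SigSet->new();
        sigprocmask(SIG_BLOCK, $block, $old) or die "sigprocmask: $!";

        my $ret = system(@cmd);

        sigprocmask(SIG_SETMASK, $old) or die "sigprocmask: $!";
        return $ret == 0;
    }

    commit_uninterruptible("svn", "commit", "-m", "web commit", "/srv/ikiwiki/wc")
        or warn "commit failed";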

Just as an experiment, I tried running ikiwiki using a *remote* repository,
i.e. via "svn+ssh". After setting up the repo and relocating the working copy,
it unfortunately doesn't work; editing a page gives the error:

    Error: no element found at line 3, column 0, byte 28 at /opt/local/lib/perl5/vendor_perl/5.10.1/darwin-multi-2level/XML/Parser.pm line 187

I *think* this is because, despite a SetEnv directive in the apache
configuration, the CGI wrapper is expunging SVN_SSH from the environment
(based on perusing the source of Wrapper.pm and looking at "envsave" there at
the top). Is this the case? --Glenn
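
Assuming the envsave guess above is right, the mechanism would look roughly
like the sketch below; the variable names, paths and suggested workaround are
illustrative rather than a patch against the real Wrapper.pm:

    # Sketch of an environment whitelist of the kind described above: anything
    # not listed in @envsave is dropped before svn runs, which would explain
    # why Apache's SetEnv for SVN_SSH never reaches the svn client.  The list
    # contents, paths and the workaround are assumptions for illustration.
    use strict;
    use warnings;

    my @envsave = qw(PATH REMOTE_ADDR QUERY_STRING HTTP_COOKIE);   # no SVN_SSH

    sub scrub_environment {
        my %keep = map { $_ => $ENV{$_} } grep { defined $ENV{$_} } @envsave;
        %ENV = %keep;   # everything else, including SVN_SSH, is gone
    }

    scrub_environment();

    # Possible workarounds: add SVN_SSH to the whitelist, or set it again just
    # before invoking svn so the scrubbing no longer matters.
    $ENV{SVN_SSH} = "ssh -i /home/ikiwiki/.ssh/id_rsa";   # hypothetical key path
    system("svn", "update", "/srv/ikiwiki/wc");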