This seemed to be due to the pagetemplate hook calling prerender. I've
observed this making it take *minutes* for the signin page to be displayed.
Running ltrace on ikiwiki showed it was matching pagespecs a lot.
This may still be a speed pain point when rendering pages, not just in the
CGI, so more work may be needed here.
Try to avoid a situation in which so many ikiwiki cgi wrapper programs are
running, all waiting on some long-running thing like a site rebuild, that
it prevents the web server from doing anything else. The current approach
only avoids this problem for GET requests; if multiple CGIs run GETs on a
site at the same time, one will display a "please wait" page for a
configurable number of seconds, and then redirect to retry. To enable
this protection, set cgi_overload_delay to the number of seconds to wait.
This is not enabled by default.
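For example, in a Perl-format setup file, enabling it looks something like
this (the 10 seconds is just an illustrative value):

    # in ikiwiki.setup: show a "please wait" page to concurrent GETs
    # while the site is busy, retrying after 10 seconds
    cgi_overload_delay => 10,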
bestlink returns '' if no existing page matches a link. This propagated
through inline and other plugins, causing uninitialized value warnings, and
in some cases (when filecheck was enabled) making the whole directive fail.
Skipping the empty results fixes that, but this is papering over another
problem: if the missing page is later added, there is no dependency
information to know that the inline needs to be updated. Perhaps smcv will
fix that later.
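The fix amounts to this pattern, sketched here rather than quoted from the
plugin code (bestlink and $params are the usual ikiwiki plugin API; @list is
an assumed variable):

    my @targets;
    foreach my $p (@list) {
    	my $target = bestlink($params{page}, $p);
    	# bestlink returns '' when no existing page matches the link;
    	# skip those rather than passing '' along to the directive.
    	next unless length $target;
    	push @targets, $target;
    }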
strftime is a C function; it does not return decoded utf8.
Several places in ikiwiki manually decoded it, but at least two
forgot to.
Also, strftime might not return even encoded utf8, if LC_TIME is set
to a non-utf8 value. Went ahead and supported decoding whatever encoding
it uses.
The remaining direct calls to strftime() are all ones that first set
LC_TIME=C, in order to get times that are not for human display.
https://rt.cpan.org/Ticket/Display.html?id=74487
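The decoding wrapper amounts to something like this sketch (the name
strftime_decoded is illustrative; this assumes strftime emits bytes in the
LC_TIME locale's codeset):

    use POSIX qw(strftime);
    use Encode qw(decode);
    use I18N::Langinfo qw(langinfo CODESET);

    sub strftime_decoded {
    	my ($format, @time) = @_;
    	# strftime returns bytes in the LC_TIME locale's encoding,
    	# which may or may not be utf8; decode whatever it is.
    	return decode(langinfo(CODESET()), strftime($format, @time));
    }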
Gave up trying to support multiple YAML backends. The XS one requires ugly
manual encoding to get unicode right, and doesn't allow dumping yaml
fragments w/o the yaml header, but at least it doesn't randomly crash
on import like YAML::Mo has started to.
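For the record, the ugly manual encoding the XS backend needs looks roughly
like this (a sketch; YAML::XS deals in utf8 bytes, while perl code usually
holds decoded strings):

    use YAML::XS ();
    use Encode qw(encode_utf8 decode_utf8);

    # Dump returns utf8 bytes; decode them back to perl's internal form.
    my $yaml = decode_utf8(YAML::XS::Dump($data));
    # Load wants utf8 bytes, so encode before parsing.
    my $roundtripped = YAML::XS::Load(encode_utf8($yaml));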
mdwn: Can use the discount markdown library, via the
Text::Markdown::Discount perl module.
This is preferred if available since it's the fastest currently supported
markdown library, speeding up markdown rendering by a factor of 40.
That is to say, when only rendering a lot of markdown, discount is 40x
faster. When building an ikiwiki site, ikiwiki's other overhead gets in the
way, but I still see significant speedups. Building the ikiwiki docwiki
dropped from 62 to 45 seconds, for example.
However, when multimarkdown is enabled, Text::MultiMarkdown is
still used.
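For reference, using the binding directly is just (a minimal sketch of the
Text::Markdown::Discount interface):

    use Text::Markdown::Discount;
    # Render markdown to html via the discount C library.
    my $html = Text::Markdown::Discount::markdown($mdwn_source);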
While discount contains some nonstandard markdown extensions,
including tables and footnotes, AFAICS most of them are not
enabled by default in the perl bindings.
I consider sticking to non-extended markdown a desirable thing, since this
is probably not the last markdown engine. In particular, sundown is waiting
in the wings to get packaged and get a perl binding.
----
Reviewing all the discount extensions, here are the ones that are enabled:

* centered paragraphs: `->centered<-`
* image sizes: `[dust mite](http://dust.mite =150x150)`
* `<style>..</style>` blocks are eaten. The perl binding does not provide
  access to the gathered CSS. This is not legal html anyway, so unlikely
  to cause breakage.
We had a weird problem where, after moving to a new, faster server,
"git push" would sometimes fail like this:

    Unpacking objects: 100% (3/3), done.
    fatal: The remote end hung up unexpectedly
    fatal: The remote end hung up unexpectedly

What turned out to be going on was that git-receive-pack was dying due
to an uncaught SIGPIPE. The SIGPIPE occurred when it tried to write to
the pre-receive hook's stdin. The pre-receive hook, in this case, was
able to do all the checks it needed to do without the input, and so did
exit(0) without consuming it.
Apparently that causes a race. Most of the time, git forks the hook,
writes output to the hook, and then the hook runs, ignores it, and exits.
But sometimes, on our new faster server, git forked the hook, and it
ran, and exited, before git got around to writing to it, resulting in
the SIGPIPE.

    write(7, "c9f98c67d70a1cfeba382ec27d87644a"..., 100) = -1 EPIPE (Broken pipe)
    --- SIGPIPE (Broken pipe) @ 0 (0) ---

I think git should ignore SIGPIPE when writing to hooks. Otherwise,
hooks may have to go out of their way to consume all input, and as I've
seen, the races when they fail to do this can lurk undiscovered.
I have written to the git mailing list about this.
As a workaround, consume all stdin before exiting.
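In a perl pre-receive hook, that workaround can be as simple as this sketch:

    # Drain stdin so git-receive-pack never gets SIGPIPE writing to us,
    # even though this hook doesn't use the ref update list it sends.
    1 while read(STDIN, my $buf, 8192);
    exit 0;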
Also, I let preview mode write real files, rather than using data: uris.
Which is ok these days, since ikiwiki tracks files created during
previewing, and cleans them up later.
In 875d550f12 I for some reason
made $page be changed when creating a discussion page, which
broke the link on the edit page. Changing $page seems unnecessary,
so I reverted that part of the change.