While here, mollify http://validator.w3.org/feed/ and
s/dcterms:creator/dc:creator/g, which happens to make rss2email see
and do nice things with authors.
Not sure if this is needed to avoid it trying to run an editor. Probably
there is never a controlling terminal and probably git notices and does
nothing. But I'm just copying what I have in git-annex assistant here.
(Although with a much worse git version comparison, that only really works due
to luck.)
I saw this happen with calendar, when it wanted to update a page that
had a calendar on it, but the page had just been deleted. This caused
srcfile_stat to crash.
RPC::XML uses ascii as the default encoding; we have to tell it to use utf8.
Without this, ikiwiki returns "failed to get response from blogspam server"
every time a non-ascii character is used in content that needs checking.
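A minimal sketch of the fix ($RPC::XML::ENCODING is the module's global
encoding setting):

use RPC::XML;

# RPC::XML defaults to us-ascii; declare utf-8 so non-ascii content
# survives the trip to the blogspam server.
$RPC::XML::ENCODING = 'utf-8';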
I want to write my blog posts in a convenient format (Emacs org mode)
but do not want commenters to be able to use this format for security
reasons. This patch makes it possible to configure which formats are allowed
for writing comments.
Effectively, it restricts the formats enabled with add_plugin to those
mentioned in comments_allowformats. If this is empty, all formats are
allowed, which is the behavior without this patch.
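For example, in the setup file (the option name is the one this patch adds;
the space-separated value is a hypothetical illustration):

# only allow mdwn and txt to be used in comments; empty allows all formats
comments_allowformats => 'mdwn txt',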
This re-fixes the same bug as 2d5c2f30, but without introducing
malformed HTML in some situations. This is not a very elegant
solution, but it has the advantage of passing more tests.
This makes the generated maps easier to debug by showing the structure.
Sample output when $spaces is set to 4 spaces:
<div class='map'>
<ul>
    <li>
        <a href="../alpha" class="mapparent">alpha</a>
        <ul>
            <li>
                <a href="../alpha/1" class="mapitem">1</a>
            </li>
        </ul>
    </li>
    <li>
        <a href="../beta" class="mapitem">beta</a>
    </li>
</ul>
</div>
Simple podcast feeds didn't have content tags and I made sure to
keep it that way. This may be unnecessarily conservative. Changing
the behavior to include empty content tags might be fine, but I
don't want to think about it right now, I just want my tests to
keep passing!
The new fancy-podcast tests are copy-pasted-and-edited from the
simple-podcast tests. These tests should be refactored.
In the test, set up the post-commit hook for more realism (and bugs!).
To make wrappers work in the test, set PERL5LIB, and allow the wrappee's
path to be overridden. Meta-test that post-commit is really hooked
up by verifying that content is getting generated in destdir.
About the longstanding bug, which as far as I know was harmless:
CVS can't operate outside a srcdir, so we're always setting $CWD.
"local $CWD" restores the previous value when we go out of scope.
Usually that's correct. But if we're removing the last file from a
directory, the post-commit hook will exec in a working directory
that's about to not exist (CVS will prune it).
The fix: chdir() manually in cvs_runcvs(), so we can selectively
not chdir() back.
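Roughly, the difference looks like this (just a sketch with a hypothetical
helper, not the actual cvs_runcvs(); with File::chdir's "local $CWD" the old
directory is always restored at end of scope):

use Cwd qw(getcwd);

sub run_in_dir {
	my ($dir, @cmd) = @_;
	my $olddir = getcwd();
	chdir $dir or die "chdir $dir: $!";
	my $ret = system(@cmd);
	# selectively chdir back: skip it when the post-commit hook's
	# commit has caused CVS to prune the directory we started in
	chdir $olddir if defined $olddir && -d $olddir;
	return $ret == 0;
}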
This seemed to be due to the pagetemplate hook calling prerender. I've
observed this making it take *minutes* for the signin page to be displayed.
ltracing ikiwiki showed it was matching pagespecs a lot.
It may be that this is still a speed pain point when rendering pages, not
just for CGI. So more work may be needed here.
Since trail members are explicitly rebuilt if the information used for
their prev/up/next boxes changes, they don't need another dependency
on the trail itself. (If the trail disappears, it will disappear from
the member's member_to_trails entry, causing a rebuild; so the add_depends
is redundant.)
Similarly, since trail members are explicitly rebuilt if their next
or previous item, or its title, changes, the presence dependencies on the
next and previous items are redundant.
If the title of a trail changes, each member of that trail must be
rebuilt, for its prev/up/next box to reflect the new title.
If the title of a member changes, its next and previous items (if any)
must be rebuilt, for their prev/up/next boxes to reflect the new title.
In the unlikely event that the ordered contents of a trail have changed
without the TRAILS or TRAILLOOP template variables being evaluated
(for instance, all trail directives are removed from a former trail
that uses a custom pagetemplate that doesn't contain TRAILS), we might
get here without having already called prerender.
Try to avoid a situation in which so many ikiwiki cgi wrapper programs are
running, all waiting on some long-running thing like a site rebuild, that
it prevents the web server from doing anything else. The current approach
only avoids this problem for GET requests; if multiple CGIs run GETs on a
site at the same time, one will display a "please wait" page for a
configurable number of seconds, which then redirects to retry. To enable
this protection, set cgi_overload_delay to the number of seconds to wait.
This is not enabled by default.
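For example, in the setup file (10 seconds is just an illustration):

# make a GET that arrives while another CGI is busy show a
# "please wait" page that retries after 10 seconds
cgi_overload_delay => 10,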
This simplifies the code and makes the configuration more intuitive, at
the cost of making the labels on the layers automatically generated and
therefore less customizable.
When set to true, let each mirror's ikiwiki CGI find out the correct target
page url itself.
This resolves the usecase described on
[[todo/mirrorlist_with_per-mirror_usedirs_settings]].
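In the setup file this looks something like the following (the boolean option
name here is an assumption; the mirrorlist hash maps mirror names to their
top urls):

mirrorlist => {
	backup => 'http://wiki.example.com',
},
# let each mirror's own ikiwiki.cgi compute the per-page url
mirrorlist_use_cgi => 1,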
Signed-off-by: intrigeri <intrigeri@boum.org>
At some point I changed the storage of trail members' membership
and forgot to update this use.
(It turns out to be rather difficult to reach this code, possibly even
impossible: it only applies if a member somehow ceases to match the
trail's pagespec without either the trail or the member changing.)
Normally, needsignin is called when there is a QUERY_STRING, not when a
form is posted. However, it's certainly possible, and should be supported,
to make a form that invokes an ikiwiki action that checks needsignin.
I encountered this when posting ?do=rename&page=foo. The form is displayed
without checking needsignin, for complicated reasons. Posting the form
is when the true authentication happens.
Previously, prune("wiki/srcdir/sandbox/test.mdwn") could delete srcdir
or even wiki, if they happened to be empty. This is rarely what you
want: there's usually some base directory (destdir, srcdir, transientdir
or another subdirectory of wikistatedir) beyond which you do not want to
delete.
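A minimal sketch of the idea, with the base directory passed in explicitly
(not necessarily ikiwiki's actual prune() signature):

use File::Basename qw(dirname);

sub prune {
	my ($file, $base) = @_;
	unlink($file);
	my $dir = dirname($file);
	# rmdir fails harmlessly on non-empty directories; never ascend
	# past $base
	while ($dir ne $base && rmdir($dir)) {
		$dir = dirname($dir);
	}
}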
Technically, when the user does this, a passwordless account is created
for them. The notify mails include a login url, and once logged in that
way, the user can enter a password to get a regular account (although
one with an annoying username).
This all requires the passwordauth plugin is enabled. A future enhancement
could be to split the passwordless user concept out into a separate plugin.
The plan is to use this for accounts that are created implicitly, as when
a non-logged-in user subscribes to notifyemail. Such an account has no
password, and login can be accomplished by way of a url that is sent to
them in email.
When the user sets a password, the passwordless login token is disabled.
A file may have no git sha1 if it's in the underlay, or just is not checked
into git. This debug message doesn't add any value and is potentially
confusing.
bestlink returns '' if no existing page matches a link. This propagated
through inline and other plugins, causing uninitialized value warnings, and
in some cases (when filecheck was enabled) making the whole directive fail.
Skipping the empty results fixes that, but this is papering over another
problem: if the missing page is later added, there is no dependency
information to know that the inline needs to be updated. Perhaps smcv will
fix that later.
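The change itself boils down to skipping the empty strings before using
them, along these lines (a fragment with hypothetical variable names):

foreach my $target (@targets) {
	next unless length $target;	# bestlink found no existing page
	# ... proceed as before ...
}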
The cgi shows a fullscreen map, so having this other option to do it seems
redundant, and also layering a fullscreen map over an existing wiki page
doesn't look very good to me (and prevents editing the page etc).
This was not set anywhere, which caused their javascript to crash.
It *seems* the idea is that this is the url to use to view the map full
screen, which uses ikiwiki.cgi.
* fix will_render calls to pass proper relative filenames
* fix urls to kml etc files to not assume the wiki's top is at /
* avoid building the javascript to display the map in two different
ways between the cgi and on-page maps
* refactor duplicate code
This hook involves urlto, and that needs to have state loaded to work
in all situations.
Note that I can see no reason for the osm plugin to use a cgi hook at all.
This could just as well be a static html page!
Foo::Bar->can("method") works just as well, even if Foo::Bar is not
loaded. Using UNIVERSAL::can is deprecated.
But I was unable to easily eliminate conditional.pm's use of UNIVERSAL::can.
Without this patch, linkmaps display underscores and underscore escape
sequences in the rendered output.
This introduces a pageescape function, which invokes pagetitle() to get
rid of underscore escapes and wraps the resulting utf8 string
appropriately for inclusion in a dot file (using dot's html encoding
because it can represent the '\"' dyad properly, and because it doesn't
need special-casing of newlines).
Build links the right way.
This also involved dropping that leading slash on the osm_default_icon.
And since it would require changing the old osm_tag_icons too,
I just removed that relic.
It just didn't work; also, it didn't use writefile, which is undesirable
for security reasons. Fixed both issues.
Also removed some unnecessary debug messages.
Add an underlay for the osm plugin.
Update links to the right path to the icon. Note that the osm plugin has a
pervasive bug in how it links to icons; it assumes the site is at /.
I did not attempt to fix that; it should be using urlto() to make a correct
relative link.
When the wiki is in a subdir of the git repo, a web revert would show
in recentchanges as eg, doc/index, instead of just index.
This happened because decode_git_file caches a $prefix that is dependent
on the $git_dir setting, and the revert code runs with a different
$git_dir, which polluted the $prefix for later.
Fix this by adding a with_git_dir that juggles the variables properly.
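A sketch of the shape of with_git_dir (hypothetical, the real code may
differ): switch $git_dir, drop the cached $prefix so decode_git_file
recomputes it, run the code, then let both revert when the scope ends.

our ($git_dir, $prefix);	# the package-level state described above

sub with_git_dir {
	my ($dir, $code) = @_;
	local $git_dir = $dir;
	local $prefix = undef;	# force decode_git_file to recompute it
	return $code->();
}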
* Test that adding a text file under a name formerly tracked as
binary (and vice versa) gets the right keyword-substitution
behavior.
* Explicitly set -kkv for text files to make the tests pass.
* CVS warns in these cases about "changing keyword expansion mode",
but this is correct behavior, so filter it from stderr. Filter
stdout the same way in case we ever want to keep any of it.
* In rcs_add(), replace comments with obviousness.
strftime is a C function; it does not return decoded utf8.
Several places in ikiwiki manually decoded it, but at least two
forgot to.
Also, strftime might not return even encoded utf8, if LC_TIME is set
to a non-utf8 value. Went ahead and supported decoding whatever encoding
it uses.
The remaining direct calls to strftime() are all ones that first set
LC_TIME=C, in order to get times that are not for human display.
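The decoding looks roughly like this (a sketch; I18N::Langinfo reports the
charset the current locale produces):

use POSIX qw(strftime);
use Encode qw(decode);
use I18N::Langinfo qw(langinfo CODESET);

# hypothetical wrapper for human-readable times, e.g.
# display_time('%c', localtime(time))
sub display_time {
	my ($format, @tm) = @_;
	# strftime returns bytes in the locale's charset; decode them
	# into a perl unicode string
	return decode(langinfo(CODESET()), strftime($format, @tm));
}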
https://rt.cpan.org/Ticket/Display.html?id=74487
Gave up trying to support multiple YAML backends. The XS one requires ugly
manual encoding to get unicode right, and doesn't allow dumping yaml
fragments w/o the yaml header, but at least it doesn't randomly crash
on import like YAML::Mo has started to.
A diff was already truncated after 200 lines. But it could still be
arbitrarily enormous, if a spammer or other random noise source likes long
lines. That could use a lot of memory to html encode etc the diff and fill
it into the template. Truncating after 100kb seems sufficient; it allows
for 200 lines of up to 512 characters each.
In the code:
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in alphabetical order).
In the tests:
* general meta-behavior (in no particular order, yet),
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in semi-logical order).
mdwn: Can use the discount markdown library, via the
Text::Markdown::Discount perl module.
This is preferred if available since it's the fastest currently supported
markdown library, speeding up markdown rendering by a factor of 40.
That is to say, when only rendering a lot of markdown, discount is 40x
faster. When building an ikiwiki site, ikiwiki's other overhead gets in the
way, but I still see significant speedups. Building the ikiwiki docwiki
dropped from 62 to 45 seconds, for example.
However, when multimarkdown is enabled, Text::Markdown::Multimarkdown is
still used.
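The perl binding is used in the simplest possible way; a minimal sketch of
the call (not ikiwiki's actual htmlize hook):

use Text::Markdown::Discount;

my $html = Text::Markdown::Discount::markdown("*hello*, discount");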
While discount contains some nonstandard markdown extensions,
including tables and footnotes, AFAICS most of them are not
enabled by default in the perl bindings.
I consider sticking to non-extended markdown a desirable thing, since this
is probably not the last markdown engine. In particular, sundown is waiting
in the wings to get packaged and get a perl binding.
----
Reviewing all the discount extensions, here are the ones that are enabled:
centered paragraphs:
->centered<-
image sizes: [dust mite](http://dust.mite =150x150)
<style>..</style> blocks are eaten. The perl binding does not provide
access to the gathered CSS. This is not legal html anyway, so unlikely
to cause breakage.
We had a weird problem where, after moving to a new, faster server,
"git push" would sometimes fail like this:
Unpacking objects: 100% (3/3), done.
fatal: The remote end hung up unexpectedly
fatal: The remote end hung up unexpectedly
What turned out to be going on was that git-receive-pack was dying due
to an uncaught SIGPIPE. The SIGPIPE occurred when it tried to write to
the pre-receive hook's stdin. The pre-receive hook, in this case, was
able to do all the checks it needed to do without the input, and so did
exit(0) without consuming it.
Apparently that causes a race. Most of the time, git forks the hook,
writes output to the hook, and then the hook runs, ignores it, and exits.
But sometimes, on our new faster server, git forked the hook, and it
ran, and exited, before git got around to writing to it, resulting in
the SIGPIPE.
write(7, "c9f98c67d70a1cfeba382ec27d87644a"..., 100) = -1 EPIPE (Broken
pipe)
--- SIGPIPE (Broken pipe) @ 0 (0) ---
I think git should ignore SIGPIPE when writing to hooks. Otherwise,
hooks may have to go out of their way to consume all input, and as I've
seen, the races when they fail to do this can lurk undiscovered.
I have written to the git mailing list about this.
As a workaround, consume all stdin before exiting.
Using a file was sorta not right.
Note that when previewing, %pagestate is not saved, so
it has to rebuild the graph every time until that graph is saved;
then previews can use the cached data until the next time the graph
is changed.
Also note that it's stored in the destpage's pagestate. The imagemap
could vary between a page and an inlined page if wikilinks were supported.
Also, I let preview mode write real files, rather than using data: uri.
Which is ok these days, since ikiwiki tracks files created during
previewing, and cleans them up later.
In 875d550f12 I for some reason
made $page be changed when creating a discussion page, which
broke the link on the edit page. Changing $page seems unnecessary,
so reverted that part of the change.
Involved dropping some checks for .svn which didn't add anything, since if
svn is enabled and you point it at a non-svn checkout, you get both pieces.
The tricky part is add and rename, in both cases the new file can be in
some subdirectory that is not added to svn.
For add, it turns out svn has a --parents option that will deal with this by
adding the intermediate directories to svn as well.
For rename though, --parents fails if the directories exist but are not
yet in svn -- which is exactly the case, since ikiwiki makes them
by calling prep_writefile. So instead, svn add the parent directory,
recursively.
tl;dr: svn made a reasonable change in dropping the .svn directories from
everywhere, but the semantics of other svn commands, particularly their
pickiness about whether parent directories are in svn or not, means
that without the easy crutch of checking for those .svn directories,
code has to tiptoe around svn to avoid pissing it off.
There's a nice message if the plugin is loaded and used and highlight is
not available, and a nice fallback. So no need for this other warning,
which can happen any time all plugins are loaded to generate a setup file.
This kind of change is scary, but this particular lock is very simply
used and so it seems ok to make it even just for better portability to
SunOS. (People still use that?)
* mercurial: openid nicknames are now used when committing. (Daniel Andersson)
* mercurial: implement rcs_commit_staged so comments, attachments, etc
can be used. (Daniel Andersson)
* mercurial: fix viewing of a diff containing non-utf8 changes.
(Daniel Andersson)
* rename: Fix logic error that broke renaming pages when the attachment
plugin was disabled.
* rename: Fix logic error that bypassed the usual pagespec checks.
If a page that looks like an email address exists, it can't be linked to.
But that's unlikely. Better to be consistent; before this change, a
wikilink with an email address in it could link to the email address or a
page, depending on when the page was created and when the page with the
link was updated.
Imagemagick does not generate svg images very well, but it can convert
them to png quite well.
For browsers that don't yet support displaying svg, this also provides a
workaround; just scale the svg down to get a png. But the workaround is
partial, since scaling the image larger, or leaving it the same size will
cause the original svg to be displayed. Since browsers are actively
improving svg support, this is good enough for me.
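For instance, something like [[!img chart.svg size=200x200]] should yield a
scaled-down png, while omitting the size (or scaling up) leaves the original
svg in place.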
Firefox sent an accept header for application/xml, not application/json,
and also weakened the priority to 0.8. So that stuff is not to be trusted;
instead I found a better way: When an ajax upload is *not* being made,
the Upload Attachment button will be used, so enable ajax if an upload
is being made without that button having been used.
Also, testing with firefox revealed it refused to process a response that
was type application/json, and checking the demo page for the jquery file
upload plugin, it actually returns the json with type text/html. Ugh.
Followed suit.
Now tested with: chromium, chromium (w/o js), firefox, firefox (w/o js),
and w3m.
Needed for attachment to return json when requested.
I think some browsers send Accept: *, so I made sure to check that json
was explicitly listed as accepted, as well as having a high
priority.
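The check is along these lines (a sketch with a hypothetical helper; the
check on the q= priority is omitted here):

sub accepts_json {
	my $accept = $ENV{HTTP_ACCEPT};
	return defined $accept && $accept =~ m{\bapplication/json\b};
}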
Left out confirmation of removal for held attachments because
a) they're not in the wiki yet, so confirmation is a bit unnecessary
b) it would be hard
c) eases later integration of jquery file upload interface
Also changed where attachments of index are held (to match where they're
stored in the srcdir).
Note that the attachment formbuilder hook was made to run last, so that
the list of attachments is not generated before removal, in the fast path
w/o confirm.