Previously, this relied on running with an installed ikiwiki
and being able to copy in recentchanges.mdwn and wikiicons/ from the
underlay in /usr. The underlay in ./underlays/basewiki can't be used
(yet) because ikiwiki doesn't allow following symlinks, even from
underlays.
I'd like to make ikiwiki follow symlinks whose destinations can be
verified to be safe (for example making it willing to expose
/usr/share/javascript to the web, but not /etc/passwd), at least from
underlays, but this is security-sensitive so I'm not going to rush
into it.
Current Perl versions put '.' at the end of the library search path
@INC, although this will be fixed in a future Perl release. This means
that when software loads an optionally-present module, it will be
looked for in the current working directory before giving up. An
attacker could use this to execute arbitrary Perl code from ikiwiki's
current working directory.
Removing '.' from the library search path in Perl is the correct
fix for this vulnerability, but is not trivial to do due to
backwards-compatibility concerns. Mitigate this (even if ikiwiki is run
with a vulnerable Perl version) by explicitly removing '.' from the
search path, and instead looking for ikiwiki's own modules relative
to the absolute path of the executable when run from the source
directory.
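A minimal sketch of that mitigation, assuming FindBin is used to
locate the executable (the real change is in ikiwiki itself; the
details here are illustrative):

```perl
use FindBin qw($Bin);

# Remove "." from @INC even on Perl versions that still append it,
# so optionally-present modules are never loaded from the current
# working directory.
BEGIN { @INC = grep { $_ ne '.' } @INC; }

# Find ikiwiki's own modules relative to the absolute path of the
# executable instead, for running from the source directory.
use lib $Bin;
```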
In tests that specifically want to use the current working directory,
use "-I".getcwd instead of "-I." so we use its absolute path, which
is immune to the removal of ".".
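For instance, a test might invoke a child perl like this
(hypothetical test code; the command line is illustrative):

```perl
use Cwd qw(getcwd);

# Pass the current working directory by absolute path, so the child
# perl still finds modules there once "." is gone from @INC.
my @command = ($^X, "-I" . getcwd, "./ikiwiki.in", "--version");
system(@command) == 0 or die "child perl failed: $?";
```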
Otherwise, if third-party plugins extend newenviron by more than
3 entries, we could overflow the array. It seems unlikely that any
third-party plugin manipulates newenviron in practice, so this
is mostly theoretical. Just in case, I have deliberately avoided
using "i" as the variable name, so that any third-party plugin
that was manipulating newenviron directly will now cause the
wrapper to fail to compile.
I have not assumed that realloc(NULL, ...) works as an equivalent of
malloc(...), in case there are still operating systems where that
doesn't work.
This mitigates CVE-2016-3714. Wiki administrators who know that they
have prevented arbitrary code execution via other formats can re-enable
the other formats if desired.
If the relative link from the (page generating the) RSS to the target
would start with "./" or "../", just concatenating it with the URL to
the directory containing the RSS is not sufficient. Go via
URI::new_abs to fix this.
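A sketch of the difference, with illustrative URLs:

```perl
use URI;

my $feedurl = "http://example.com/blog/index.rss";
my $link    = "../photos/a.jpg";

# Naive concatenation with the feed's directory would leave "../"
# in the result; URI::new_abs resolves it against the base instead.
my $abs = URI->new_abs($link, $feedurl);
# http://example.com/photos/a.jpg
```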
$im->Read() takes a filename-like argument with several sets of special
syntax. Most of the possible metacharacters are escaped by the
default `wiki_file_chars` (and in any case not particularly disruptive),
but the colon ":" is not.
It seems the way to force ImageMagick to treat colons within the
filename as literal is to prepend a colon, so do that.
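Roughly, as a sketch rather than the exact plugin code:

```perl
use Image::Magick;

my $srcfile = "label:example.png";   # a real file whose name contains ":"
my $im = Image::Magick->new;

# Without the leading ":", ImageMagick would interpret "label:" as
# its special text-rendering syntax rather than as part of a filename.
my $err = $im->Read(":" . $srcfile);
die "Read failed: $err" if $err;
```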
[[forum/refresh_and_setup]] indicates some confusion between --setup
and -setup. Both work, but it's clearer if we stick to one in
documentation and code.
A 2012 commit to [[plugins/theme]] claims that "-setup" is required
and "--setup" won't work, but I cannot find any evidence in ikiwiki's
source code that this has ever been the case.
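One likely reason both spellings work: ikiwiki parses its command
line with Getopt::Long, which by default accepts long options with
either one or two leading dashes. A minimal demonstration:

```perl
use Getopt::Long;

my $setup;
# Getopt::Long treats "-setup" and "--setup" identically by default,
# which would explain why both spellings have always worked.
GetOptions("setup=s" => \$setup) or die "usage: $0 --setup file\n";
print "setup file: $setup\n" if defined $setup;
```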
Discount in current Debian unstable turns the IURI href into a URI
by encoding the Unicode as UTF-8 and %-escaping each byte.
That is valid, and matches Wikipedia's expectations, but was breaking
this test for me.
It would also be entirely valid (and lead to equivalent parsing) if the
ö was represented as &ouml;, &#xf6; or &#246; in the text and/or the href.
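The escaping Discount now performs looks like this (illustrative,
using URI::Escape rather than Discount itself):

```perl
use URI::Escape qw(uri_escape_utf8);

# "ö" becomes the UTF-8 bytes 0xC3 0xB6, and each byte is then
# %-escaped, so an href containing "ö" ends up with "%C3%B6".
my $escaped = uri_escape_utf8("föö");   # "f%C3%B6%C3%B6"
```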
Inkscape loses the bounding box of an SVG with no content when it
converts it to EPS, and ImageMagick does not have a special case for
converting SVG to PNG with Inkscape in one step (which Inkscape can do);
it prefers to convert SVG to EPS with Inkscape, then EPS to the desired
format.
This means that people can do XSLT nonsense if they want to.
The failures are currently marked TODO because not everything in the
docwiki is in fact well-formed.
According to caniuse.com, a significant fraction of Web users are
still using Internet Explorer versions that do not support HTML5
sectioning elements. However, claiming we're XHTML 1.0 Strict
means we can't use features invented in the last 12 years, even if
they degrade gracefully in older browsers (like the role and placeholder
attributes).
This means our output is no longer valid according to any particular
DTD. Real browsers and other non-validator user-agents have never
cared about DTD compliance anyway, so I don't think this is a real loss.
I didn't try to parameterize whether a test should fail if we can't
remove ikiwiki.cgi because there already isn't one. (Hooray, natural
language.) Instead, we stop worrying about it and always tolerate
ENOENT.
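A sketch of the resulting pattern (the helper name is hypothetical):

```perl
use Errno qw(ENOENT);

# Remove ikiwiki.cgi if it is there; a file that is already absent
# is fine, but any other failure is still fatal.
sub remove_cgi {
    my $path = shift;
    unlink($path) or $! == ENOENT
        or die "cannot remove $path: $!";
}
```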