The simple implementation of this, which I'd prefer to use, would be:
if we can import LWPx::ParanoidAgent, use it; otherwise, use
LWP::UserAgent.
However, aggregate has historically worked with proxies, and
LWPx::ParanoidAgent quite reasonably refuses to work with proxies
(because it can't know whether those proxies are going to do the same
filtering that LWPx::ParanoidAgent would).
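A minimal sketch of that fallback, assuming a helper along these
lines (the function name and arguments are illustrative, not
ikiwiki's exact API):

    # Prefer the SSRF-hardened agent if it can be loaded; otherwise
    # fall back to plain LWP::UserAgent.
    sub useragent {
        if (eval { require LWPx::ParanoidAgent; 1 }) {
            # Refuses private/loopback addresses and non-HTTP
            # schemes such as file:///
            return LWPx::ParanoidAgent->new(agent => 'ikiwiki');
        }
        require LWP::UserAgent;
        return LWP::UserAgent->new(agent => 'ikiwiki');
    }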
Signed-off-by: Simon McVittie <smcv@debian.org>
This prevents the aggregate plugin from being used to read the contents
of local files via file:/// URLs.
Signed-off-by: Simon McVittie <smcv@debian.org>
The input to filter hooks is meant to be the content of a source file
on disk. If we only filter once per (page, destpage) pair, and a page
is inlined into the same destpage more than once, then the second
occurrence will render as the result of htmlizing .po source as if
it were Markdown (or whatever the type of the corresponding master page
is), which is never going to end well.
The alreadyfiltered mechanism was added in commit 1e874b3f to avoid
preprocessing loops, but I'm not sure how a loop could arise here:
filter hooks are only called from IkiWiki::filter, which is only called
on page content from disk or on proposed content being previewed.
According to <https://bugs.debian.org/911356#41>, deleting the
alreadyfiltered mechanism resolves the problem, as well as simplifying
the code.
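For illustration, the removed mechanism amounted to a guard like
this (heavily simplified, not the real code):

    my %alreadyfiltered;
    sub filter (@) {
        my %params = @_;
        my $key = "$params{page}\t$params{destpage}";
        # On the second inline of the same page into the same
        # destpage, this returned the .po source unfiltered, to be
        # htmlized as though it were the master page's type.
        return $params{content} if $alreadyfiltered{$key};
        $alreadyfiltered{$key} = 1;
        # ... the real hook converted the .po content here ...
        return $params{content};
    }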
Closes: #911356
Tested-by: intrigeri
My previous attempt to reproduce this bug used a non-alphanumeric
ASCII character. This is not currently considered to be a valid
value for rootpage, although for a "do what I mean" approach, perhaps
we should accept it and pass it through titlepage() or linkpage().
Using Chinese characters (which are considered to match [[:alnum:]]
even though the Chinese script is not, strictly speaking, an alphabet),
as in the original bug report, reproduces the bug.
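A quick check (assuming the source is saved as UTF-8):

    use utf8;
    # CJK ideographs have the Unicode Alphabetic property, so they
    # match [[:alnum:]] under Perl's Unicode rules.
    print "中文" =~ /^[[:alnum:]]+$/ ? "match\n" : "no match\n";
    # prints "match"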
Signed-off-by: Simon McVittie <smcv@debian.org>
By processing the pagenames through linkpage, we let users specify
page names that contain non-alphanumerics in a more natural way.
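linkpage() escapes anything that is not allowed in a page name the
same way a [[wikilink]] would, so with the default wiki_file_chars
the effect is roughly:

    IkiWiki::linkpage("foo bar");   # => "foo_bar"
    IkiWiki::linkpage("50%");       # => "50__37__" (37 is ord '%')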
Signed-off-by: Simon McVittie <smcv@debian.org>
This is one of several possible bugs reported on
"doc/bugs/About %2F problem" (I'm not sure which bug was actually
being reported).
Signed-off-by: Simon McVittie <smcv@debian.org>
I'm still not completely sure how it happened, and I can't reproduce
it myself, but in the Debian build of ikiwiki 3.20180105, wikitext.pm
ended up empty. The build fixes in commits 3aacac3b, efcbeaa0,
b32480f0 hopefully fixed this.
Signed-off-by: Simon McVittie <smcv@debian.org>
Unconditionally passing arbitrary numbers as flags turns out to be a
bad idea, because some of the "unused" values have historically had
side-effects internal to libdiscount. Detect whether the known flags
work by rendering short Markdown snippets the first time we htmlize,
checking whether each known flag is both necessary and sufficient.
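A sketch of that probing (the snippet and the assumed flag value are
illustrative; the real code uses its own snippets and covers each
flag it passes):

    require Text::Markdown::Discount;
    # Assumed numeric value for libdiscount's MKD_NOPANTS, which
    # disables "smartypants" punctuation; not being able to trust
    # such numbers is the whole point of probing.
    my $MKD_NOPANTS = 0x4;
    my $flags = 0;
    my $with    = Text::Markdown::Discount::markdown("foo...\n", $MKD_NOPANTS);
    my $without = Text::Markdown::Discount::markdown("foo...\n", 0);
    # Keep the flag only if it is both necessary (the "..." gets
    # smartened without it) and sufficient (it stays literal with it).
    $flags |= $MKD_NOPANTS
        if index($with, '...') >= 0 && index($without, '...') < 0;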
Signed-off-by: Simon McVittie <smcv@debian.org>
This used to work, but has been interpreted as a literal part of
the filename since ImageMagick 6.9.8-3. In newer versions, there does
not seem to be any way to indicate that a filename containing ':' is
to be taken literally without first knowing the decoder to use.
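In other words, the only way to read such a file now is to name the
decoder up front, along these lines with PerlMagick (a sketch; the
path is hypothetical):

    use Image::Magick;
    my $im = Image::Magick->new;
    # The "png:" prefix names the decoder explicitly, so the rest of
    # the string is treated as the filename even though it contains
    # a ':'.
    my $err = $im->Read("png:dir/weird:name.png");
    warn $err if $err;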
Signed-off-by: Simon McVittie <smcv@debian.org>
Remove OpenID provider icons from the login selector, since OpenID
providers are increasingly not working: Verisign retired theirs, and
AOL and Yahoo/Flickr are not commonly used for OpenID. Any users who
still clicked those icons to log in will need to enter their OpenID
URL instead.
This commit was sponsored by Andrea Rota.
This also exercises the typical centralized git repository workflow,
where changes flow from a non-bare clone (for example on a laptop)
to a centralized bare repository, then from the centralized bare
repository to a non-bare clone that is ikiwiki's srcdir.
Signed-off-by: Simon McVittie <smcv@debian.org>
This still smuggles it past the sanitize step, but avoids having
other plugins that want to capture text content without markup
(notably toc) see the CSS as if it were text content.
This hopefully fixes a race condition in which the test failed
around 6% of the time.
If we don't wait, the mtime (which is rounded down to 1 second precision
in the APIs we use) will not necessarily change, so the update will not
necessarily cause the page to be refreshed.
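A minimal sketch of the wait ($srcfile is a hypothetical name for
the page's source file):

    my $old_mtime = (stat $srcfile)[9];
    # Spin in whole-second steps until the clock has visibly moved
    # past the old mtime, then rewrite the file.
    sleep 1 until time > $old_mtime;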
Bug-Debian: https://bugs.debian.org/862494
Previously it was relying on running with an installed ikiwiki
and being able to copy in recentchanges.mdwn and wikiicons/ from the
underlay in /usr. The underlay in ./underlays/basewiki can't be used
(yet) because ikiwiki doesn't allow following symlinks, even from
underlays.
I'd like to make ikiwiki follow symlinks whose destinations can be
verified to be safe (for example making it willing to expose
/usr/share/javascript to the web, but not /etc/passwd), at least from
underlays, but this is security-sensitive so I'm not going to rush
into it.
Current Perl versions put '.' at the end of the library search path
@INC, although this will be fixed in a future Perl release. This means
that when software loads an optionally-present module, it will be
looked for in the current working directory before giving up. An
attacker could use this to execute arbitrary Perl code from ikiwiki's
current working directory.
Removing '.' from the library search path in Perl is the correct
fix for this vulnerability, but is not trivial to do due to
backwards-compatibility concerns. Mitigate this (even if ikiwiki is run
with a vulnerable Perl version) by explicitly removing '.' from the
search path, and instead looking for ikiwiki's own modules relative
to the absolute path of the executable when run from the source
directory.
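A sketch of that mitigation (hedged: ikiwiki's real startup code
differs in detail):

    use FindBin;
    BEGIN {
        # Never search the current working directory, even on Perl
        # versions that still put "." at the end of @INC.
        @INC = grep { $_ ne '.' } @INC;
        # When running uninstalled, find IkiWiki.pm next to the
        # absolute, symlink-resolved path of this executable.
        unshift @INC, $FindBin::RealBin
            if -e "$FindBin::RealBin/IkiWiki.pm";
    }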
In tests that specifically want to use the current working directory,
use "-I".getcwd instead of "-I." so we use its absolute path, which
is immune to the removal of ".".
Otherwise, if third-party plugins extend newenviron by more than
3 entries, we could overflow the array. It seems unlikely that any
third-party plugin manipulates newenviron in practice, so this
is mostly theoretical. Just in case, I have deliberately avoided
using "i" as the variable name, so that any third-party plugin
that was manipulating newenviron directly will now cause the
wrapper to fail to compile.
I have not assumed that realloc(NULL, ...) works as an equivalent of
malloc(...), in case there are still operating systems where that
doesn't work.