aggregate: When a feed has an enclosure that is an image, audio, or video,
include the enclosure in the generated page.
The enclosure is hotlinked from the original feed, not copied.
My use case is to include a Mastodon RSS feed among other RSS feeds, for
users who don't use Mastodon. It could also be used to aggregate
podcasts, etc.
Enclosure types other than image, audio, and video could be added,
perhaps including a generic fallback, but these are the main ones.
The template uses 50% width for image and video, because attachments are
often high-resolution and would otherwise default to a size that is too
wide for the page or takes up a lot of vertical space. Making them take
up at most half the page width avoids that, while also leaving room for
any sidebar.
Sponsored-by: Shae Erisson on Patreon
As of 3.51, searchFile() is no longer provided in highlight's Perl
bindings (at least on NetBSD and OS X, as built from pkgsrc). This
leaves us falling through to getConfDir(), which has been gone
rather longer.
From highlight git, it appears searchFile() and getFiletypesConfPath()
both originated in the 3.14 release. The latter is still available in
3.51, and returns the same result searchFile() used to. Switch to it.
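The resulting lookup, as a sketch (assuming highlight's Swig-generated
DataDir API; method availability varies by release):

    my $data_dir = highlight::DataDir->new();
    # searchFile() is gone in 3.51; getFiletypesConfPath() has been
    # available since 3.14 and returns the same result.
    my $filetypes_conf = $data_dir->can('searchFile')
        ? $data_dir->searchFile('filetypes.conf')
        : $data_dir->getFiletypesConfPath('filetypes');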
The simple implementation of this, which I'd prefer to use, would be:
if we can import LWPx::ParanoidAgent, use it; otherwise, use
LWP::UserAgent.
However, aggregate has historically worked with proxies, and
LWPx::ParanoidAgent quite reasonably refuses to work with proxies
(because it can't know whether those proxies are going to do the same
filtering that LWPx::ParanoidAgent would).
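A minimal sketch of the resulting compromise (not the exact code),
assuming proxy configuration arrives via the usual environment
variables:

    my $ua;
    if (! $ENV{http_proxy} && ! $ENV{https_proxy}
        && eval { require LWPx::ParanoidAgent; 1 }) {
        $ua = LWPx::ParanoidAgent->new;
    }
    else {
        require LWP::UserAgent;
        $ua = LWP::UserAgent->new(env_proxy => 1);
    }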
Signed-off-by: Simon McVittie <smcv@debian.org>
The input to filter hooks is meant to be the content of a source file
on disk. If we only filter once per (page, destpage) pair, and a page
is inlined into the same destpage more than once, then the second
occurrence will render as the result of htmlizing .po source as if
it was Markdown (or whatever the type of the corresponding master page
is), which is never going to end well.
The alreadyfiltered mechanism was added in commit 1e874b3f to avoid
preprocessing loops, but I'm not sure how it could lead to a loop:
filter hooks are only called from IkiWiki::filter, which is only called
on page content from disk or on proposed content being previewed.
According to <https://bugs.debian.org/911356#41>, deleting the
alreadyfiltered mechanism resolves the problem, as well as simplifying
the code.
Closes: #911356
Tested-by: intrigeri
JavaScript resources should be presented to browsers after CSS, and
"after the fold" (ATF), according to best practices:
https://developers.google.com/speed/docs/insights/mobile#PutStylesBeforeScripts
This change allows the browser to download JavaScript files in
parallel, by including the JavaScript just before the *closing* </body>
tag instead of right after the opening tag.
We also improve the regex to tolerate spaces before the body tag, as
some templates have (proper) indentation for the tag.
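A sketch of the substitution (variable names illustrative):

    # Insert the script includes just before </body>, tolerating any
    # indentation in front of the closing tag.
    $content =~ s!(\s*</body[^>]*>)!$javascript$1!i;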
By processing the pagenames through linkpage, we let users specify
page names that contain non-alphanumerics in a more natural way.
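For example (a sketch using ikiwiki's existing linkpage() helper):

    # "My Page" style names become page names, as in a wikilink
    my @pages = map { IkiWiki::linkpage($_) } @pagenames;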
Signed-off-by: Simon McVittie <smcv@debian.org>
This better explains what it contains, which is a wikilink to the page
to go to after posting the vote. And postlink is a name more consistent
with posttrail.
Modern web users probably expect the poll to move on automatically to the
next question, and this allows for that behavior.
Note that bestlink() runs at vote time, which avoids needing to make the
page containing the poll depend on the page that sets up a trail, as the
current trail at vote time will be used.
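A sketch of the vote-time resolution (the parameter name is from this
change; the surrounding calls are illustrative):

    # Resolve postlink like a wikilink only when the vote is cast, so
    # the poll page need not depend on the page that set up the trail.
    my $target = IkiWiki::bestlink($page, $params{postlink});
    IkiWiki::redirect($cgi, IkiWiki::urlto($target));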
This commit was sponsored by Eric Drechsel on Patreon.
When an aggregated post lacked a title, the code first prepended the
$feed->{dir} to it, and only then checked if it had zero length. So,
that check could never succeed and it was possible to end up with
$page="dir/", and writing to that would of course fail.
(Same problem could also occur when the whole title got sanitized away by the
wiki_file_regexp.)
Fixed by simply checking earlier if $page is empty.
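A sketch of the corrected ordering (the fallback name is illustrative):

    my ($page) = $title =~ /$config{wiki_file_regexp}/;
    # check for emptiness *before* prepending the feed directory,
    # so we can never end up trying to write to "dir/"
    $page = 'item' if ! defined $page || ! length $page;
    $page = $feed->{dir} . '/' . $page;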
Based on a patch by Alexandre Oliva which got lost in a maze of email
folders all alike for over two years despite him mentioning it to me at
least once in person.
You can still use [[!meta name="date" content="..."]] to generate
<meta> tags that are not interpreted, but the common case for
[[!meta date="..."]] is that you want to change the ctime, and that
won't work without Date::Parse.
Signed-off-by: Simon McVittie <smcv@debian.org>
If for some reason you want to create <meta name="date" content="12345">,
this now requires [[!meta name="date" content="12345"]].
Signed-off-by: Simon McVittie <smcv@debian.org>
Unconditionally passing arbitrary numbers as flags turns out to be a
bad idea, because some of the "unused" values have historically had
side-effects internal to libdiscount. Detect whether the known flags
work by rendering short Markdown snippets the first time we htmlize,
checking whether each known flag is both necessary and sufficient.
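The probe, as a sketch (snippet and pattern are placeholders): a flag is
trusted only if the test snippet renders as expected with it and not
without it.

    sub flag_works {
        my ($flag, $snippet, $expected) = @_;
        my $with    = Text::Markdown::Discount::markdown($snippet, $flag);
        my $without = Text::Markdown::Discount::markdown($snippet, 0);
        # sufficient (on => effect) and necessary (off => no effect)
        return ($with =~ $expected) && ($without !~ $expected);
    }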
Signed-off-by: Simon McVittie <smcv@debian.org>
An empty coder name used to detect the format implicitly, but has been
interpreted as a literal part of the filename since ImageMagick 6.9.8-3.
In newer versions, there does not seem to be any way to indicate that
a filename containing ':' is to be taken literally without first
knowing the decoder to use.
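A sketch with PerlMagick showing the distinction (the coder value is
illustrative):

    use Image::Magick;
    my $filename = 'chart:2018.png';   # a name containing ':'
    my $im = Image::Magick->new;
    # An explicit coder makes the ':' unambiguous; an empty coder
    # (":$filename") no longer means "detect the format".
    my $err = $im->Read('png:' . $filename);
    die $err if $err;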
Signed-off-by: Simon McVittie <smcv@debian.org>
The Discount package in Debian historically enabled fenced code blocks,
PHP Markdown Extra-style definition lists, and an expanded character
set for tag names. Since Discount 2.2.0 those are runtime settings, so
enable them. Unfortunately Text::Markdown::Discount doesn't yet expose
the necessary constants:
https://rt.cpan.org/Public/Bug/Display.html?id=124188
The IDANCHOR option was historically also enabled in Debian, but is not
enabled here because ikiwiki does not enable the TOC option, and
IDANCHOR does nothing without TOC.
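Until the binding exports them, the values have to be copied by hand; a
sketch, with numeric values taken from Discount 2.2's mkdio.h (treat
them as assumptions and verify against your build):

    use Text::Markdown::Discount ();
    my $text = "# hello\n";
    my $FENCEDCODE = 0x02000000;   # MKD_FENCEDCODE
    my $DLEXTRA    = 0x01000000;   # MKD_DLEXTRA, Extra-style dl lists
    my $GITHUBTAGS = 0x08000000;   # MKD_GITHUBTAGS, expanded tag names
    my $html = Text::Markdown::Discount::markdown($text,
        $FENCEDCODE | $DLEXTRA | $GITHUBTAGS);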
Closes: #888055
* emailauth: Fix cookie problem when user is on https and the cgiurl
uses http, by making the emailed login link use https.
* passwordauth: Use https for emailed password reset link when user
is on https.
Not entirely happy with this approach, but I don't currently see a
better one.
I have not verified that the passwordauth change fixes any problem,
other than the user getting an http link when they were using https.
The emailauth problem is verified to be fixed by this commit.
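A sketch of the scheme upgrade (helper and names illustrative, not the
exact code):

    sub maybe_upgrade_scheme {
        my ($cgi, $link) = @_;
        # CGI.pm's https() is true when the request arrived over TLS
        $link =~ s!^http://!https://! if $cgi->https;
        return $link;
    }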
This commit was sponsored by Michael Magin.
Due to the use/abuse of CGI::Session to generate a token for the login
process, a new session database was created for each login, and left behind
afterwards. While each file is small, with many logins this could bloat
the size of /tmp significantly. Fixed by making CGI::Session write to
/dev/null, since there does not seem to be a way to entirely prevent the
writing.
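The leak, as a sketch of CGI::Session's default file driver: the login
flow only wanted a token, but constructing the session also creates a
file.

    use CGI::Session;
    my $session = CGI::Session->new();   # default driver writes to /tmp
    my $token   = $session->id();        # all the login flow wanted
    # on flush/destruction this leaves a /tmp/cgisess_* file behind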
This commit was sponsored by Henrik Riomar on Patreon.
Those were not in the original HTML5 spec, but have been added in the
WHATWG HTML living standard and have wide browser support.
This commit was sponsored by John Peloquin on Patreon.
savestate is not the right place to write wiki content, and in particular
this breaks websetup if osm's dependencies are not installed, even if
the osm plugin is not actually enabled. (Closes: #719913)
This is not a full solution: it should be possible to render the PoI files
for only the maps that changed, from the format, changes or rendered
hook. However, getting that right would require more understanding of
this plugin, and this version is enough to not break websetup. This
version is the closest correct hook to the one where this previously
took place.
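A sketch of the hook move (handler name hypothetical; hook() is
ikiwiki's plugin registration API):

    # write the PoI files from a rendering-time hook instead of
    # savestate, so loading the plugin can't break websetup
    hook(type => "changes", id => "osm", call => \&write_poi_files);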
This still smuggles it past the sanitize step, but avoids having
other plugins that want to capture text content without markup
(notably toc) see the CSS as if it was text content.
Reasoning: if headings have identifiers, they are probably more useful
anchors than the automatically generated anchors we build in the toc
plugin. This can happen if, for example, you use the `multimarkdown`
plugin, which inserts `id` tags for every header it encounters. This
also leverages the `headinganchors` plugin nicely.
Keeps backwards-compatibility with old toc-generated #indexXhY
anchors.
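The precedence rule, as a sketch (attribute handling illustrative):

    # prefer an id supplied by the page (e.g. via multimarkdown or
    # headinganchors) over a generated #indexXhY anchor
    my $anchor = (defined $attrs->{id} && length $attrs->{id})
        ? $attrs->{id}
        : 'index' . ++$index . 'h' . $level;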
This avoids misinterpreting initials ("C. S. Lewis was an author"),
the abbreviation for Monsieur ("M. Descartes was a philosopher") and
German page numbering ("S. 42") as ordered lists if they happen to
begin a line.
This only affects the default Discount implementation: Text::Markdown
and Text::MultiMarkdown do not have this feature anyway. A new
mdwn_alpha_list option can be used to restore the old interpretation.
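In a Perl-format setup file, restoring the old behaviour is one line
(option name as introduced above):

    # interpret "a. first item" style lines as ordered lists again
    mdwn_alpha_list => 1,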
The Perl binding defaults to MKD_NOHEADER|MKD_NOPANTS anyway, but
making them explicit means we can use other flags of our choice,
and makes it easier to justify why those flags are appropriate.
ikiwiki's web interface does not currently have UI for removing
multiple pages simultaneously, but the remove plugin is robust
against doing so. Use a clearer idiom to make that obvious.
These instances of code similar to OVE-20170111-0001 are not believed
to be exploitable, because defined(), length(), setpassword(),
userinfo_set() and the binary "." operator all have prototypes that
force the relevant argument to be evaluated in scalar context. However,
using a safer idiom makes mistakes less likely.
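The safer idiom, in a sketch: force scalar context explicitly instead
of relying on a prototype to do it.

    my $user = scalar $q->param('name');   # one value, never a list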
(cherry picked from commit 69230a2220f673c66b5ab875bfc759b32a241c0d)
Calling CGI::FormBuilder::field with a name argument in list context
returns zero or more user-specified values of the named field, even
if that field was not declared as supporting multiple values.
Passing the result of field as a function parameter counts as list
context. This is the same bad behaviour that is now discouraged
for CGI::param.
In this case we pass the multiple values to CGI::Session::param.
That accessor has six possible calling conventions, of which four are
documented. If an attacker passes (2*n + 1) values for the 'name'
field, for example name=a&name=b&name=c, we end up in one of the
undocumented calling conventions for param:
    # equivalent to: (name => 'a', b => 'c')
    $session->param('name', 'a', 'b', 'c')
and the 'b' session parameter is unexpectedly set to an
attacker-specified value.
In particular, if an attacker "bob" specifies
name=bob&name=name&name=alice, then authentication is carried out
for "bob" but the CGI::Session ends up containing {name => 'alice'},
an authentication bypass vulnerability.
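A sketch of the defence: collapse the field to a single value before it
can fan out into param's list calling conventions.

    my $name = scalar $form->field('name');   # scalar context: one value
    $session->param(name => $name);           # exactly one key, one value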
This vulnerability is tracked as OVE-20170111-0001.
(cherry picked from commit e909eb93f4530a175d622360a8433e833ecf0254)
git_sha1 already puts "--" before its arguments, so

    git_sha1_file($dir, 'doc/index.mdwn')

would have incorrectly invoked

    git rev-list --max-count=1 HEAD -- -- doc/index.mdwn
If there is no file in the wiki named "--", that's harmless, because
it merely names the latest revision in which either "--" or
"doc/index.mdwn" changed. However, it could return incorrect results
if there is somehow a file named "--".
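The corrected composition, as a sketch (bodies abbreviated to the
relevant call):

    sub git_sha1_file {
        my ($dir, $file) = @_;
        in_git_dir($dir, sub {
            # git_sha1() adds the "--" itself; don't pass another
            git_sha1($file);
        });
    }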
If we throw an exception (usually from run_or_die), in_git_dir won't
shift the current directory off the stack. That's usually fine,
but in rcs_preprevert we catch exceptions and do some cleanup before
returning, for which we need the git directory to be the root and
not the temporary working tree.
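A sketch of the exception-safe unwind (the stack variable is
hypothetical):

    sub in_git_dir {
        my ($dir, $code) = @_;
        unshift @git_dir_stack, $dir;
        my @ret = eval { $code->() };
        my $err = $@;
        shift @git_dir_stack;   # unwind even when $code->() died
        die $err if $err;
        return @ret;
    }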
Some of these might be relatively expensive to dereference or result
in messages being logged, and there's no reason why a search engine
should need to index them. (In particular, we'd probably prefer search
engines to index the rendered page, not its source code.)
We exclude .git/hooks from symlinking into the temporary working tree,
which avoids the commit hook being run for the temporary branch anyway.
This avoids the wiki not being updated if an orthogonal change is
received in process A, while process B prepares a revert that is
subsequently cancelled.
Otherwise, we have a time-of-check/time-of-use vulnerability:
rcs_preprevert previously looked at what changed in the commit we are
reverting, not at what would result from reverting it now. In
particular, if some files were renamed since the commit we are
reverting, a revert of changes that were within the designated
subdirectory and allowed by check_canchange() might now affect
files that are outside the designated subdirectory or disallowed
by check_canchange().
It is not sufficient to disable rename detection, since git older
than 2.8.0rc0 (in particular the version in Debian stable) silently
accepts and ignores the relevant options.
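Conceptually, the fix checks at the time of use (helper names
hypothetical; the real code wraps this in ikiwiki's git machinery):

    # perform the revert on a temporary branch, then vet what it would
    # change *as of now*, renames and all
    my $tmp_branch = prepare_revert_branch($rev);            # hypothetical
    my @changes    = changes_between('HEAD', $tmp_branch);   # hypothetical
    check_canchange($cgi, $session, @changes);   # existing ikiwiki check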
OVE-20161226-0002