... as Joey suggested on todo/need_global_renamepage_hook
This hook is applied recursively to returned additional rename
hashes, so that it handles the case where two plugins use the hook:
plugin A would see when plugin B adds a new file to be renamed.
The full set of rename hashes can no longer be changed by hook functions, which
are now only allowed to return any additional rename hashes they want to add.
Rationale: the correct behavior of the recursion would be hard, if not
impossible, to define if already-considered pages could change along the way.
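For illustration, a minimal sketch of a conforming hook (field and function
names are hypothetical, not from this commit):

    sub rename (@) {
        my %params = @_;
        my @add;
        foreach my $r (@{$params{torename}}) {
            # queue a sidecar file to follow each renamed page;
            # assumes each rename hash carries src/dest keys
            push @add, {
                src => $r->{src}.".sidecar",
                dest => $r->{dest}.".sidecar",
            };
        }
        return @add;    # only *additional* renames may be returned
    }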
Signed-off-by: intrigeri <intrigeri@boum.org>
This means that the underlay needs to have a wmd/wmd/wmd.js,
which is a trifle weird, but it isolates all the wmd stuff in a
single wmd subdirectory of the built wiki. Having wmd/images create
a toplevel images directory was particularly bad.
This is likely a misconfiguration and can cause login to fail, as the
browser refuses to send the session cookie back over http.
Not entirely happy with putting the check where I did, since users have to
try to log in, and fail, to see the misconfiguration explained. But I could
not find a better place to put the check.
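A minimal sketch of the intended check (exact message and placement are
illustrative only):

    if ($config{sslcookie} && ! $ENV{HTTPS}) {
        error(gettext("Login failed: sslcookie is set but this request did not come over https"));
    }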
This is potentially expensive, but is necessary so that meta and tag
directives, and other links on templates affect the page using the template
reliably.
It no longer makes sense to keep these functions in editpage, because
several plugins now exist that use them, and users may want to disable
editpage, while leaving those plugins enabled.
Most notably, comments uses both functions, and it's entirely appropriate
to disable editpage but still want to have comments enabled.
Less likely, attachments, rename, and remove all use check_canedit -- but
it would be unusual indeed to want to use these without editpage.
Falls back to looking for shortcuts.mdwn for backwards compatibility; there
probably exist wikis that have changed the pageext but still use
shortcuts.mdwn.
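A sketch of the lookup order (assuming srcfile's optional no-error
parameter):

    my $shortcuts = srcfile("shortcuts.".$config{default_pageext}, 1)
                 || srcfile("shortcuts.mdwn", 1);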
See [[bugs/Aggregated_Atom_feeds_are_double-encoded]]. By default,
XML::Atom outputs strings of UTF-8 bytes with the Perl UTF8 flag stripped
off, which IkiWiki assumes to be Latin-1 and re-encodes as UTF-8 on
output. XML::Feed does not currently (0.41-1) set the magic variable to
change this behaviour (I've filed a bug on CPAN), but IkiWiki can
usefully set the same variable as a workaround.
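The workaround amounts to setting, before handing feeds to XML::Feed:

    $XML::Atom::ForceUnicode = 1;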
This may already work with other web servers that have copied apache's
interface, and it should be easy to add support for web servers that
use some other interface. So, make the name more general.
This redirects to the given page (or if none is given, the page parameter
given to the CGI), or displays an error with a create link if the page
doesn't exist.
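A hypothetical sketch of that behaviour (assumes $cgi is a CGI object and
helper names resembling IkiWiki's; not necessarily the real code):

    my $page = scalar $cgi->param("page");
    my $link = bestlink("", $page);
    if (! length $link) {
        # show an error page that includes a create link for $page
    }
    else {
        IkiWiki::redirect($cgi, urlto($link, undef, 1));
    }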
... as my meta branch probably won't be merged before the po plugin is, contrary
to what I originally supposed.
This implies removing the po_translation_status_in_links and
po_strictly_refresh_backlinks options.
Added a note to the TODO section to consider bringing these features back later,
as they really enhance user experience on a translatable wiki.
Signed-off-by: intrigeri <intrigeri@boum.org>
This is intended to solve Joey's concerns expressed on
http://ikiwiki.info/todo/need_global_renamepage_hook/, i.e. the need to make it
possible to use this hook from external plugins.
A plugin using this hook still can add/modify/remove elements of the
@torename array.
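At this stage of the design a hook could, for instance, append to the array
in place (sketch; names and calling convention hypothetical):

    sub renamepages (@) {
        my %params = @_;
        # mutate the shared list directly: add, modify or remove entries
        push @{$params{torename}}, { src => "foo", dest => "bar" };
    }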
Signed-off-by: intrigeri <intrigeri@boum.org>
... that was removed in 68869d664b
Without this scalar, a two-element array is passed to $template->param, which
builds a hash with an odd number of elements.
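The underlying Perl gotcha, with hypothetical names:

    # wrong: if mylist() returns two values, param() receives three
    # arguments and the key/value pairing shifts by one
    $template->param(foo => mylist());
    # right: force scalar context so exactly one value is passed
    $template->param(foo => scalar mylist());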
Signed-off-by: intrigeri <intrigeri@boum.org>
After some thought, I can't find a reason why the type of a page being created
in the CGI should be restricted to po. So the previous case seems enough.
Signed-off-by: intrigeri <intrigeri@boum.org>
check_canremove/canrename is called only for its side effect (of failing if
removal is not allowed); its return value is never used, and returning
something makes that unclear.
use is file-scoped, so warnings and strict are already enabled
inside the second package, and IkiWiki is already loaded
(though not imported into this context)
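I.e., in a single file (sketch; names and version illustrative):

    package IkiWiki::Plugin::example;
    use warnings;
    use strict;
    use IkiWiki 3.00;

    package IkiWiki::PageSpec;
    # no use lines needed here: strict and warnings are file-scoped,
    # and IkiWiki.pm is already loaded (just not imported)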
It was calling format hooks for each comment on the page.
When relativedate is enabled, that made it insert <script> tags
for each comment. And the browser loaded the same script over and over,
which was slow on its own. But that was nothing compared to running
the onload event over and over, especially since the hook system
added a new call to the hook each time it loaded.
For a page with 10 comments, that caused the relativedate DOM parsing
code to run 1000 times, I think. Anyway, it was slow. Now it runs once.
If suitable alternate text is unknown, then it should not be given.
Empty alt text is suitable mainly for purely decorative images.
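E.g. (directive parameters illustrative):

    [[!img logo.png alt="Project logo"]]   # alt text known and useful
    [[!img corner.png alt=""]]             # purely decorative
    [[!img photo.jpg]]                     # alt unknown: omit it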
(cherry picked from commit 3cd7f67f0cf894f4fd5ba16f68e82e4f7bdbfdc5)
Always pass the full (modified) content in `content` named parameter. When the
user edits an existing wiki page, also pass a `diff` named parameter, which
includes only the lines that they added to the page, or modified.
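A hook consuming these parameters might look like (sketch; hook name and
calling convention assumed):

    sub cansave (@) {
        my %params = @_;
        my $content = $params{content};  # full modified page content
        my $diff = $params{diff};        # only present when editing an
                                         # existing page: added/changed lines
        return undef;                    # undef = allow the save
    }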
Signed-off-by: intrigeri <intrigeri@boum.org>
Some aggregators, like Planet, sort by mtime rather than ctime. This
means that posts with modified content come to the top (which seems odd
to me, but is presumably what the aggregator's author or operator
wants), but it also means that posts with insignificant edits (like
adding tags) come to the top too. Atom defines <updated> to be the date
of the last *significant* change, so it's fine that ikiwiki defaults to
using the mtime, but it would be good to have a way for the author to
say "that edit was insignificant, don't use that mtime".
That resulted in double encoded display when using perl's stub
readline module. Apparently that module unconditionally upgrades
text to utf8, in a quite braindead way.
(Term::ReadLine::Gnu::Perl worked ok.)
Use mtn for monotone and hg for mercurial. The long names cause ugly
formatting in recentchanges, which has CSS that only allows a few
characters for the commit type column.
All meta titles are first extracted at scan time, i.e. before we turn
PO files back into translated markdown; escaping of double-quotes in
PO files breaks the meta plugin's parsing enough to save ugly titles
to %pagestate at this time.
Then, at render time, every page passes in turn through the Great
Rendering Chain (filter->preprocess->linkify->htmlize), and the meta
plugin's preprocess hook is this time in a position to correctly
extract the titles from slave pages.
This is, unfortunately, too late: if the page A, linking to the page B,
is rendered before B, it will display the wrongly-extracted meta title
as the link text to B.
On the one hand, such a corner case only happens on rebuild: on
refresh, every rendered page is fixed to contain correct meta titles.
On the other hand, it can take some time to get every page fixed.
We therefore re-render every rendered page after a rebuild to fix them
at once. As this more or less doubles the time needed to rebuild the
wiki, we do so only when really needed.
Signed-off-by: intrigeri <intrigeri@boum.org>
It will set up an ikiwiki instance tuned for use in blogging.
As part of this change, move the example sites into /usr/share/ikiwiki so
they are available even if docs are not installed.
Asking for only the HEAD worked in my tests, but I've found a site where it
didn't -- apparently ikiwiki didn't get a chance to do or finish the
refresh when HEADed. Getting the whole url and waiting for ikiwiki to
finish avoids the update problem.
* repolist: New plugin to support the rel=vcs-* microformat.
* goodstuff: Include repolist by default. (But it does nothing until
configured with the repository locations.)
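Once configured, the setup file would contain something like (option name
per the repolist plugin; URL illustrative):

    repositories => [qw{git://git.example.com/wiki.git}],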
Form validation works, but after trying to save invalid PO content, the user is
brought back to the page they were editing, without a single clue to explain
why it was not saved. The dedicated cansave hook is thus necessary.
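A sketch of such a hook (helper name and message are illustrative):

    sub cansave (@) {
        my %params = @_;
        if (! po_is_valid($params{content})) {
            # returning a message blocks the save and explains why
            return gettext("invalid gettext data, go back and fix it");
        }
        return undef;
    }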
Signed-off-by: intrigeri <intrigeri@boum.org>
This has to be done after the rename/remove plugins have added
their buttons, so we set this hook to be run last.
The canrename/canremove hooks already ensure this is forbidden
at the backend level, so this is only UI sugar.
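Concretely (sketch; hook type assumed):

    hook(type => "formbuilder_setup", id => "po",
        call => \&formbuilder_setup, last => 1);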
Signed-off-by: intrigeri <intrigeri@boum.org>
The main reason to do so is to bypass the "favor the type of linking page on
page creation" logic, which is unsuitable when a broken link is clicked on
a slave (PO) page.
Signed-off-by: intrigeri <intrigeri@boum.org>
This is not needed by my current use of it, but seems more consistent to me.
Future users of this hook may need this data to make up their minds.
Signed-off-by: intrigeri <intrigeri@boum.org>
... so that nicepagetitle hook's effects, such as translation status displayed
in links, are updated when the linked page changes.
The replacement of 'my %backlinks' with 'our %backlinks' in Render.pm made this
work: previously, every postscan hook was called with an almost empty
%backlinks, which defeated all my attempts to implement this feature.
This feature hits performance a bit. Its cost was quite small in my real-world
use-cases (a few percent longer refresh time), but could be bigger in worst
cases. Time will tell.
NB: this hack could also be used by my meta branch. It may even become an
optional ikiwiki feature.
Signed-off-by: intrigeri <intrigeri@boum.org>
Thanks to the new rename hook behaviour, all the renaming work is now done
by the rename plugin, and we don't need to remember which pages were renamed.
inline has a format hook that is an optimisation hack. Until this hook
runs, the inlined content is not present on the page. This can prevent
other format hooks, that process that content, from acting on inlined
content. In bug #509710, we discovered this happened commonly for the
embed plugin, but it could in theory happen for many other plugins (color,
cutpaste, etc) that use format to fill in special html after sanitization.
The ordering was essentially random (hash key order). That's kinda a good
thing, because hooks should be independent of other hooks and able to run
in any order. But for things like inline, that just doesn't work.
To fix the immediate problem, let's make hooks able to be registered as
running "first". There was already the ability to make them run "last".
Now, this simple first/middle/last ordering is obviously not going to work
if a lot of things need to run first, or last, since then we'll be back to
being unable to specify ordering inside those sets. But before worrying about
that too much, and considering dependency ordering, etc, observe how few
plugins use last ordering: Exactly one needs it. And, so far, exactly one
needs first ordering. So for now, KISS.
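Registration-wise this is just one more flag; a sketch of it applied to
inline's format hook as described above:

    hook(type => "format", id => "inline", call => \&format, first => 1);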
Another implementation note: I could have sorted the plugins with
first/last/middle as the primary key, and plugin name secondary, to get a
guaranteed stable order. Instead, I chose to preserve hash order. Two
opposing things pulled me toward that decision:
1. Since hash order is randomish, it will ensure that no accidental
ordering assumptions are made.
2. Assume for a minute that ordering matters a lot more than expected.
Drastically changing the order a particular configuration uses could
result in a lot of subtle bugs cropping up. (I hope this assumption is
false, partly due to #1, but can't rule it out.)
People seem to expect to be able to enter www.foo.com and get away with it.
The resulting my.wiki/www.foo.com link was not ideal.
To fix it, use URI::Heuristic to expand such things into a real url. It
even looks up hostnames in the DNS if necessary.
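E.g.:

    use URI::Heuristic qw(uf_uristr);
    my $url = uf_uristr("www.foo.com");    # -> "http://www.foo.com"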
A new ikiwiki-transition moveprefs subcommand can pull the old data out of
the userdb and inject it into the setup file.
Note that it leaves the old values behind in the userdb too. I did this
because I didn't want to lose data if it fails writing the setup file for
some reason, and the old data in the userdb will only use a small amount of
space. Running the command multiple times will mostly not change anything.
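Typical usage (setup file name illustrative):

    ikiwiki-transition moveprefs ikiwiki.setup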
This leads to better display for OpenIDs like smcv.pseudorandom.co.uk
and thm.id.fedoraproject.org (to take a couple of examples from the
IkiWiki commit history).
None of the comment state needs to be stored through a later run of
ikiwiki, so move it all from pagestate to more transient storage.
This is assuming that we'll never want to add pagespecs to search against
the comment state. Pagespecs like author() are why the meta plugin does
store its meta data in pagestate -- the data may be needed later to match
against.