Now aggregation will not lock the wiki. Any changes made during aggregation are
merged in with the changed state accumulated while aggregating. A separate
lock file prevents multiple concurrent aggregators. Garbage collection
of orphaned guids is much improved. loadstate() is only called once
per process, so tricky support for reloading wiki state is not needed.
(Tested fairly thoroughly.)
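A sketch of the separate-lock-file idea (the file name and details here
are assumptions, not necessarily the actual code):

    use Fcntl qw(:flock);
    open(my $lock, '>', '.ikiwiki/aggregatelock') or die "open: $!";
    if (! flock($lock, LOCK_EX | LOCK_NB)) {
        exit 0;  # another aggregator is already running; bail out
    }
    # ... aggregate feeds, merge in state accumulated meanwhile ...
    flock($lock, LOCK_UN);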
license, and copyright. This can be used to create custom RecentChanges.
* meta: To support the pagespec functions, metadata about pages has to be
retained as pagestate.
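For illustration, a hypothetical pagespec function consulting that
pagestate (the name, signature convention, and match logic here are
assumptions):

    package IkiWiki::PageSpec;

    sub match_author ($$;@) {
        my $page = shift;
        my $wanted = shift;
        my $author = $IkiWiki::pagestate{$page}{meta}{author};
        return IkiWiki::FailReason->new("no author recorded")
            unless defined $author;
        return $author eq $wanted
            ? IkiWiki::SuccessReason->new("author is $wanted")
            : IkiWiki::FailReason->new("author is $author");
    }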
* Fix encoding bug when pagestate values contained spaces.
I kept it to a simple global configuration, rather than using the
preprocessor directive for recentchanges, because that had chicken-and-egg
problems and seemed overcomplicated. This should work reasonably well,
though it would be good to add some more metadata so that more customised
recentchanges pages can be made.
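For example, in the wiki's .setup file (the values are examples):

    recentchangespage => 'recentchanges',  # page to generate under
    recentchangesnum => 100,               # number of changes to keep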
remove the enclosing paragraph and newline that markdown wraps it in.
This allows removing several hacks around this markdown behavior from
other plugins that htmlize fragments of pages.
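Roughly the idea, as a sketch (not the exact regexps used):

    sub strip_enclosing_p {
        my $html = shift;
        $html =~ s/^\s*<p>//i;    # leading paragraph tag markdown adds
        $html =~ s/<\/p>\s*$//i;  # trailing close tag and newline
        return $html;
    }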
returned (and not run in some cases) rather than the plugins directly
forcing a user to log in.
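A sketch of such a hook returning a closure instead of forcing the
login itself (the hook interface is ikiwiki's; the body is an example):

    hook(type => "canedit", id => "signinedit", call => sub {
        my ($page, $cgi, $session) = @_;
        return undef if defined $session->param("name");  # logged in
        # Hand back a closure; the caller runs it only when needed.
        return sub { IkiWiki::needsignin($cgi, $session) };
    });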
* opendiscussion: allow editing of the toplevel discussion page,
and, indirectly, allow creating new discussion pages.
the needsbuild hook. This resulted in feeds not being removed when pages
were updated, and probably other bugs.
* aggregate: Avoid uninitialised value warning when removing a feed that
has an expired guid.
links required meta to be run during scan, which complicated its data
storage, since it had to clear data stored during the scan pass to avoid
duplicating it during the normal preprocessing pass.
* If you used "meta link", you should switch to either "meta openid" (for
openid delegations), or tags (for internal, invisible links). I assume
that nobody really used "meta link" for external, non-openid links, since
the htmlscrubber ate those. (Tell me differently and I'll consider bringing
back that support.)
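For example (directive syntax; the URL and tag name are placeholders):

    [[!meta openid="http://yourid.example/"]]   # openid delegation
    [[!tag sometag]]                            # internal link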
* meta: Improved data storage.
* meta: Drop the hackish filter hook that was used to clear
  stored data before preprocessing; the hack was ugly, and broken
  (cf. liw's disappearing openids).
* aggregate: Convert filter hook to a needsbuild hook.
and forces rebuilds of the pages that contain calendars. So
running ikiwiki --refresh at midnight is now enough; there's no need for
a full wiki rebuild each midnight.
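For example, from cron (the setup file path is an example):

    # refresh the wiki just after midnight
    0 0 * * * ikiwiki --setup ~/wiki.setup --refresh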
* calendar: Work around block html parsing bug in markdown 1.0.1 by
enclosing the calendar in an extra div.
which has been reported to cause encoding problems (though I haven't
reproduced them), just catch a failure of markdown, and retry.
(The crazy perl bug magically disappears on the retry.)
Closes: #449379
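The catch-and-retry, schematically (a sketch assuming the Markdown.pm
interface, not the exact code):

    require Markdown;

    sub markdown_with_retry {
        my $content = shift;
        my $html = eval { Markdown::Markdown($content) };
        if ($@) {
            # the crazy perl bug goes away on a second attempt
            $html = eval { Markdown::Markdown($content) };
            die $@ if $@;  # a real failure; re-throw it
        }
        return $html;
    }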
to be created owned by some group other than the default. Useful
when there's a shared repository with access controlled by a group,
to let ikiwiki run setgid to that group.
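For example, in the .setup file (the group name is an example):

    wrappergroup => 'ikiwiki',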
* ikiwiki-mass-rebuild: Run build with the user in all their groups.
in the wikilink looked like a table field separator. Avoid this ambiguity
by linkifying the data before parsing it as a table.
* Turn on allow_loose_quotes in the table plugin's Text::CSV object,
so that links from wikilinks don't confuse the parser.
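Roughly (allow_loose_quotes is a real Text::CSV option; the rest of the
setup here is illustrative, not necessarily the plugin's):

    use Text::CSV;
    my $csv = Text::CSV->new({
        allow_loose_quotes => 1,  # tolerate quotes in wikilink html
    }) or die Text::CSV->error_diag();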
* Plugins can add new directories to the search path with the add_underlay
function.
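For example, from a plugin's import function (mirroring how the smiley
plugin might use it; the version number is illustrative):

    package IkiWiki::Plugin::smiley;
    use IkiWiki 2.00;

    sub import {
        # serve extra underlay files only when this plugin is enabled
        add_underlay("smiley");
    }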
* Split out smiley underlay files into a separate underlay, so if the plugin
isn't used, the wiki isn't bloated with all those files.
* Support building on systems that lack asprintf.
* mercurial getctime is currently broken, apparently by some change in
mercurial version 0.9.4. Turn the failing test case into a TODO test case.
old files.
* Change where the img plugin puts scaled images. It's better to make the
scaled images subpages of the page that embeds them, rather than putting
them alongside the original image, since if two pages scale the same image
the same way, this prevents complications in dealing with two pages
creating the same file. The move will be handled transparently, though you
might want to rebuild your wiki to make it occur in one step.
until the wiki is building and already locked, unless it's aggregating.
When aggregating, it does not wait for the lock if it cannot get it, and
instead exits, to prevent aggregating processes from piling up.