Usage:
1. Update all pagespecs that use aggregated pages to use internal()
   (see the example below).
2. ikiwiki-transition aggregateinternal $srcdir $htmlext
   (where $srcdir and $htmlext are the srcdir and htmlext options in
   your .setup file)
3. Add aggregateinternal to your .setup file.
4. Rebuild the wiki.
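For concreteness, a sketch of those steps with example values (an assumed feed location of feeds/*, srcdir of ~/wiki, htmlext of html, and a setup file named wiki.setup; the bang-prefixed directive syntax assumes prefix_directives is enabled):

    # 1. in pages that inline the aggregated content, switch the pagespec:
    #        [[!inline pages="feeds/*" archive="yes"]]
    #    becomes
    #        [[!inline pages="internal(feeds/*)" archive="yes"]]
    # 2. run the transition:
    ikiwiki-transition aggregateinternal ~/wiki html
    # 3. add to wiki.setup:   aggregateinternal => 1,
    # 4. rebuild:
    ikiwiki --setup wiki.setup --rebuild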
This change addresses <http://ikiwiki.info/todo/aggregate_to_internal_pages/>
in a simple way. With this approach, a flag day is required, on which all
users of aggregated pages start to inline them using the internal() pagespec;
after that, the aggregateinternal option can safely be switched on in the
setup file (and the old aggregated pages can be deleted by hand).
Now aggregation will not lock the wiki. Any changes made during aggregation are
merged in with the changed state accumulated while aggregating. A separate
lock file prevents multiple concurrent aggregators. Garbage collection
of orphaned guids is much improved. loadstate() is only called once
per process, so tricky support for reloading wiki state is not needed.
(Tested fairly thoroughly.)
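A minimal sketch of the separate-lock idea, not the plugin's actual code (the lock file path and message are made up): the aggregator takes its own lock non-blockingly and simply exits if another aggregator already holds it.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    # A dedicated aggregation lock, separate from the main wiki lock.
    my $lockfile = ".ikiwiki/aggregatelock";    # example path
    open(my $lock, '>>', $lockfile) or die "cannot open $lockfile: $!";
    if (! flock($lock, LOCK_EX | LOCK_NB)) {
        # Another aggregator already holds the lock; don't pile up behind it.
        print STDERR "another aggregator is already running; exiting\n";
        exit 0;
    }
    # ... fetch and store feeds here; the lock is released on exit ...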
gettext choked on a Unicode apostrophe in the aggregate plugin, which
appeared in a new error message in commit
4f872b5633. Replace it with an ASCII
apostrophe.
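The message below is invented, not the one from that commit; it just shows the shape of the fix, with IkiWiki's gettext() wrapper being handed an ASCII apostrophe instead of U+2019.

    use IkiWiki 2.00;    # provides gettext() and debug() to plugins

    # hypothetical message, not the actual one from commit 4f872b5633
    debug(gettext("feed doesn’t exist"));   # U+2019: this is what choked gettext
    debug(gettext("feed doesn't exist"));   # plain ASCII apostrophe: fine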
* aggregate: Run aggregation in a child process; this simplifies
  the code, since that process can change internal state as needed, and
  it will automatically be cleaned up for the parent process, which proceeds
  to render the changes.
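A minimal sketch of that parent/child split, with placeholder functions standing in for the real aggregation and rendering code:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholders for the real aggregation and rendering steps.
    sub aggregate_feeds { print "child: fetching and storing feeds\n"; }
    sub render_changes  { print "parent: rendering the changed pages\n"; }

    defined(my $pid = fork()) or die "fork failed: $!";
    if ($pid == 0) {
        # Child: free to change internal state; it all goes away on exit.
        aggregate_feeds();
        exit 0;
    }
    waitpid($pid, 0);          # parent waits for the child to finish
    die "aggregation failed" if $? != 0;
    render_changes();          # parent proceeds to render the changes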
* aggregate: Fix a bug introduced by the conversion to
  the needsbuild hook. This resulted in feeds not being removed when pages
  were updated, and probably other bugs.
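For context, this is roughly what registering a needsbuild hook looks like in an ikiwiki plugin; the plugin name and hook body here are illustrative, not the aggregate plugin's actual code.

    package IkiWiki::Plugin::example;    # hypothetical plugin, for illustration
    use warnings;
    use strict;
    use IkiWiki 2.00;

    sub import {
        hook(type => "needsbuild", id => "example", call => \&needsbuild);
    }

    sub needsbuild {
        my $needsbuild = shift;    # array ref of source files about to be built
        foreach my $file (@$needsbuild) {
            # inspect or refresh per-page plugin state for pages being rebuilt
            debug("will rebuild: $file");
        }
        return $needsbuild;        # later ikiwiki versions expect the list back
    }

    1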
* aggregate: Avoid uninitialised value warning when removing a feed that
has an expired guid.
* meta: Remove support for "meta link", since supporting this for internal
  links required meta to be run during scan, which complicated its data
  storage, since it had to clear data stored during the scan pass to avoid
  duplicating it during the normal preprocessing pass.
* If you used "meta link", you should switch to either "meta openid" (for
openid delegations), or tags (for internal, invisible links); see the
example after this list. I assume
that nobody really used "meta link" for external, non-openid links, since
the htmlscrubber ate those. (Tell me differently and I'll consider bringing
back that support.)
* meta: Improved data storage.
* meta: Drop the hackish filter hook that was used to clear
stored data before preprocessing; this hack was ugly, and broken (cf.
liw's disappearing openids).
* aggregate: Convert filter hook to a needsbuild hook.
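To make the "meta link" migration advice above concrete (the URLs and tag name are invented, and the bang-prefixed directive syntax assumes prefix_directives is enabled):

    No longer supported:
        [[!meta link="http://example.com/"]]
    For an openid delegation:
        [[!meta openid="http://example.com/" server="http://openid.example.com/"]]
    For an internal, invisible link, use a tag instead:
        [[!tag sometag]]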
* aggregate: Fix an expiry bug triggered when the same
  page name had been expired and reused for several distinct guids. When this
  happened, the expiry code counted each past guid that had used that page
  name as a currently existing page, and thus expired too many pages.
until the wiki is building and already locked, unless it's aggregating.
When aggregating, it does not wait for the lock if it cannot get it, and
instead exits, to prevent aggregating processes from piling up.
* The calling convention for functions in the IkiWiki::PageSpec namespace
  has changed so they are passed named parameters, as needed
  for extended pagespecs. The old calling convention will still work for
  back-compat for now.
* Plugin interface version increased to 2.00 since I don't anticipate any
more interface changes before 2.0.
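A hedged sketch of the new convention, with an invented match function: the page and the spec's argument come first, followed by named parameters such as location (the page containing the pagespec). In the ikiwiki versions I have seen, match functions return IkiWiki::SuccessReason or IkiWiki::FailReason objects carrying a human-readable reason.

    package IkiWiki::PageSpec;

    use warnings;
    use strict;

    # Invented example: a shares_prefix(foo) pagespec that matches pages
    # whose names start with the given prefix.
    sub match_shares_prefix ($$;@) {
        my $page = shift;       # page being matched
        my $prefix = shift;     # the argument inside shares_prefix(...)
        my %params = @_;        # named parameters, e.g. location => the page
                                # that contains the pagespec

        if ($page =~ /^\Q$prefix\E/) {
            return IkiWiki::SuccessReason->new("$page starts with $prefix");
        }
        else {
            return IkiWiki::FailReason->new("$page does not start with $prefix".
                (defined $params{location} ? " (pagespec from $params{location})" : ""));
        }
    }

    1

A pagespec could then use it as shares_prefix(blog/).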
* Add a "usedirs" option that makes ikiwiki use foo/index.html instead of
  foo.html as output page names; thanks to everyone who worked
  on and supported creating it (especially Tumov). It is not yet enabled
  by default.
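If you want to try it before it becomes the default, it is a one-line addition to the Perl-hash style setup file:

    # in your .setup file
    usedirs => 1,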
* Improve error handling when writing files, so that failures are detected,
  including out of disk space situations. ikiwiki should never leave
  truncated files, and if the error occurs during a web-based file edit,
  the user will be given an opportunity to retry.
  Inspired by the many ways Moin Moin destroys itself when out of disk. :-)
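The underlying technique, sketched here in plain Perl rather than ikiwiki's own code: check every step of the write, including close(), and write to a temporary name first, so a full disk produces an error rather than a silently truncated file. The helper name and temp-file suffix are made up.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch of a failure-checked file write. print and close can both fail
    # when the disk fills, so ignoring their return values risks leaving a
    # truncated file behind.
    sub write_checked {
        my ($file, $content) = @_;
        my $new = "$file.new";                      # write to a temp file first
        open(my $out, '>', $new) or die "failed to open $new: $!";
        print $out $content       or die "failed to write $new: $!";
        close($out)               or die "failed to write $new: $!";
        rename($new, $file)       or die "failed to rename $new to $file: $!";
    }

    write_checked("example.html", "<p>hello</p>\n");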
* Fix syslogging of errors.