master
joey 2006-08-02 05:32:10 +00:00
parent fae3da7ce3
commit 044a5eba9a
2 changed files with 75 additions and 6 deletions

@@ -6,12 +6,27 @@
This only happens if the page is removed from the inlined pagespec due to
a tag changing; the problem is that once the tag is changed, ikiwiki does
not know that the page used to match before.
Another example would be a pagespec that only matches new pages:

    newer(1 day)

Obviously, the pages that match are going to change, and again, once they
do, ikiwiki will no longer know that they matched before, so it won't know
to remove them from a page that used that pagespec to inline them.
To fix, seems I would need to record the actual list of pages that are
currently included on an inline page, and do a comparison to see if any
have changed.
So, quick fixes aside, what's the generic mechanism here that a plugin can
use to let ikiwiki know that a page should be updated if some other page
stops matching its dependencies pagespec?
At first I thought, why not just add them to the dependencies
explicitly? But that failed because the dependencies GlobList failed to
match when a negated expression like "!tag(bugs/done)" was used. It is,
however, doable with PageSpecs:

    (real deps here) or (list of all currently inlined pages here)

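For example, if the real pagespec were `!tag(bugs/done)` and pages
`bugs/foo` and `bugs/bar` (made-up names) happened to be inlined at the
moment, the stored dependency would be:

    !tag(bugs/done) or (bugs/foo or bugs/bar)

Then when `bugs/foo` is tagged done and stops matching the first half, it
still matches the second, so ikiwiki knows to rebuild the inlining page.
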
However, it's not really clear to me how to _remove_ inlined pages from the
deps when they stop being inlined for whatever reason. So a separate list
would be better.
So this is blocked by [[todo/plugin_data_storage]] I suppose.

@@ -0,0 +1,54 @@
ikiwiki currently stores some key data in .ikiwiki/index. Some plugins need a
way to store additional data, and ideally it would be something managed by
ikiwiki instead of ad-hoc because:
* consistency is good
* ikiwiki knows when a page is removed and can stop storing data for that
page; plugins have to go to some lengths to track that and remove their
data
* it's generally too much code and work to maintain a separate data store
The aggregate plugin is a use case: of its 324 lines, 70 are data storage
and another 10 handle deletion. Also, it's able to use a format very like
ikiwiki's, but it does need to store some lists in there, which complicates
it somewhat and means that a very naive translation between a big per-page
hash and the .index won't be good enough.
The current ikiwiki index format is not very flexible, although it is at
least fairly easy and inexpensive to parse as well as hand-edit.
Would this do?
* Plugins can register savestate and loadstate hooks. The hook id is the
key used in the index file that the hook handles.
* loadstate hooks are called once per page, and passed the page name plus
a list of all values stored for the registered key on that page; they
should store the data somewhere.
* savestate hooks are called and passed a page, and should return a list of
all values for that key for that page.
* If they need anything more complex than a list of values, they will need
to encode it somehow in the list.
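
A rough sketch of how a plugin might use such hooks, registering them the
way ikiwiki plugins register other hooks. The loadstate/savestate hook
types and their calling convention are just this proposal, not existing
API, and the plugin name, key, and argument order are made up:

    package IkiWiki::Plugin::example;

    use warnings;
    use strict;
    use IkiWiki;

    my %foo; # page name => list of stored values, kept between load and save

    sub import {
        # Hypothetical hook types from this proposal; the id doubles as
        # the key used for this plugin's data in the index file.
        IkiWiki::hook(type => "loadstate", id => "example", call => \&loadstate);
        IkiWiki::hook(type => "savestate", id => "example", call => \&savestate);
    }

    # Called once per page, passed the page name and all values stored
    # for our key on that page (argument order assumed).
    sub loadstate {
        my ($page, @values) = @_;
        $foo{$page} = [@values];
    }

    # Called once per page; returns the list of values to store for our key.
    sub savestate {
        my $page = shift;
        return @{$foo{$page} || []};
    }

    1
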
Hmm, that's potentially a lot of function calls per page at each load/save
though... For fewer function calls, only call each hook *once* per
load/save, and have it passed/return a big hash of pages and the values for
each page. (Which probably means `%state=@_` for load and `return %state`
for save.)
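
In that variant, the hooks collapse to almost nothing (sketch, same
caveats as above):

    my %state;

    # Called once at load time with a flattened hash of pages and values.
    sub loadstate {
        %state = @_;
    }

    # Called once at save time; returns the same shape of hash.
    sub savestate {
        return %state;
    }
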
It may also be better to just punt on lists, and require plugins that need
even lists to encode them. Especially since in many cases, `join(" ", @list)`
will do.
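
If so, a plugin holding a list per page could round-trip it like this
(sketch; only safe while the values can never contain the delimiter):

    my @list = ("bugs/foo", "bugs/bar");  # made-up values
    my $value = join(" ", @list);         # flattened for storage
    my @back  = split(' ', $value);       # recovered at load time
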
Note that for the aggregate plugin to use this, it will need some changes:
* guid data will need to be stored as part of the data for the page
that was aggregated from that guid
* All feeds will need to be marked as removable in loadstate, and only
unmarked if seen in preprocess. Then savestate will need to not only
remove any feeds still marked as such, but also unlink the pages
aggregated from them (see the sketch below).
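
A sketch of that mark-and-sweep; `%feeds` matches the hash the aggregate
plugin really keeps, but the `removable` flag and the helper name are
made up:

    my %feeds; # feed name => feed data

    sub loadstate {
        # Mark every feed read from stored state as a removal candidate.
        $_->{removable} = 1 foreach values %feeds;
    }

    sub preprocess {
        my %params = @_;
        # A feed still named by an [[aggregate ]] directive survives.
        $feeds{$params{name}}->{removable} = 0
            if exists $feeds{$params{name}};
        # ... rest of the directive handling ...
    }

    sub savestate {
        foreach my $name (keys %feeds) {
            next unless $feeds{$name}->{removable};
            # Unlink pages aggregated from this feed, then drop the feed.
            expire_aggregated_pages($feeds{$name}); # hypothetical helper
            delete $feeds{$name};
        }
    }
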
If I do this, I might as well also:
* Change the link= link= stuff to just links=link+link etc.
* Change the delimiter from space to comma; commas are rare in index files,
so there would be fewer ugly escaped delimiters to deal with.
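
For illustration only (made-up values, not real index lines), the shape of
the change:

    before:  link=sandbox link=todo/done
    after:   links=sandbox,todo/done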