Something has changed in CGI.pm in perl 5.10. It used to not care
if STDIN was opened using :utf8, but now it'll mis-encode utf-8 values
when used that way by ikiwiki. Now I have to binmode(STDIN) before
instantiating the CGI object.
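For illustration, the workaround amounts to something like this (a minimal sketch, not the exact ikiwiki code):

    use CGI;
    binmode(STDIN);     # drop any :utf8 layer before CGI reads the request body
    my $q = CGI->new;   # CGI now sees raw octets, so decoding stays consistent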
In 57bba4dac1, I changed from decoding
CGI::FormBuilder fields as utf-8 to decoding cgi parameters before setting
up the form object. As of perl 5.10, that approach no longer has any effect
(reason unknown). To get correctly encoded values in FormBuilder forms,
they must once again be decoded after the form is set up.
As noted in 57bba4da, this can cause one set of problems for
formbuilder_setup hooks if decode_form_utf8 is called before the hooks, and
a different set if it's called after. To avoid both sets of problems, call
it both before and after. (Only remaining problem is the sheer ugliness and
inefficiency of that..)
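A rough sketch of the resulting call order (ikiwiki's actual plumbing differs in detail):

    decode_form_utf8($form);        # fields registered so far
    run_hooks(formbuilder_setup => sub {
        shift->(form => $form, cgi => $q, session => $session);
    });
    decode_form_utf8($form);        # fields added by the hooks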
I think that these changes will also work with older perl versions, but I
haven't checked.
Also, in the case of the poll plugin, the cgi parameter needs to be
explicitly decoded before it is used, to handle utf-8 values. (This may have
always been broken, not sure if it's related to perl 5.10 or not.)
Turns out duplicate index files do not need to be stored when usedirs is in
use, just when it's not. Ikiwiki is quite consistent about using page/ when
usedirs is in use. (The only exception is the search plugin, which needs
fixing.)
This also includes significant code cleanup, removal of an incorrect special
case for empty files, and addition of a workaround for a bug in the amazon
perl module.
The fix involved embedding the session id in the forms, and not allowing the
forms to be submitted if the embedded id does not match the session id.
In the case of the preferences form, if the session id is not embedded,
then the CGI parameters are cleared. This avoids a secondary attack where the
link to the preferences form prefills password or other fields, and
the user hits "submit" without noticing these prefilled values.
In the case of the editpage form, the anonok plugin can allow anyone to edit,
and so I chose not to guard against CSRF attacks against users who are not
logged in. Otherwise, it also embeds the session id and checks it.
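A minimal sketch of the mechanism, using CGI::FormBuilder and CGI::Session (the field name "sid" is illustrative):

    # when building the form, embed the session id as a hidden field
    $form->field(name => "sid", type => "hidden",
        value => $session->id, force => 1);
    # when the form comes back, refuse to act unless the ids match
    if ($form->submitted) {
        my $sid = $form->field("sid");
        die "possible CSRF attack\n"
            unless defined $sid && $sid eq $session->id;
    }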
For page editing, I assume that the user will notice if content or commit
message is changed because of CGI parameters, and won't blindly hit save page.
So I didn't block those CGI parameters. (It's even possible to use those CGI
parameters, for good, not for evil, I guess..)
The only other CSRF attack I can think of in ikiwiki involves the poll plugin.
It's certainly possible to set up a link that causes the user to unknowingly
vote in a poll. However, the poll plugin is not intended to be used for things
that people would want to attack, since anyone can after all edit the poll page
and fill in any values they like. So this "attack" is ignorable.
tag 473987 +patch
The issue is that we need to convert relative links to absolute
ones for atom and rss feeds -- but there are two types of
relative links. The first kind, relative to the current
document ( href="some/path") is handled correctly. The second
kind of relative url is relative to the http server
base (href="/semi-abs/path"), and that broke.
It broke because we just prepended the url of the current
document to the href (http://host/path/to/this-doc/ + link),
which gave us, in the first case,
http://host/path/to/this-doc/some/path [correct], and in the second,
http://host/path/to/this-doc//semi-abs/path [wrong].
The fix is to calculate the base for the http server (the base of
the wiki does not help, since the base of the wiki can be
different from the base of the http server -- I have, for example,
"url => http://host.name.mine/blog/manoj/"), and prepend that to
the relative references that start with a /.
This has been tested.
Signed-off-by: Manoj Srivastava <srivasta@debian.org>
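Roughly, the two cases described above end up being handled like this (a sketch of the idea, not the actual patch; $baseurl stands for the url of the current document):

    my ($server_base) = $config{url} =~ m{^(\w+://[^/]+)};  # e.g. http://host.name.mine
    if ($href =~ m{^/}) {
        $href = $server_base.$href;   # server-relative: /semi-abs/path
    }
    elsif ($href !~ m{^\w+://}) {
        $href = $baseurl.$href;       # document-relative: some/path
    }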
destpage does not normally need to be worried about when creating other files
as part of the process of rendering a page. Using destpage results in
inlined pages creating two copies of such files. It works to not use destpage
in this case because the inlining page depends on the source page, so if the
source page is modified or deleted the inlining page will be updated.
Instead of using the XML-RPC v2 extension <nil/>, which Perl's
XML::RPC::Parser does not (yet) support (Joey's patch is pending), we
agreed on a sentinel: {'null':''}, that is, a hash with a single key
"null" pointing to the empty string.
The Python proxy automatically converts None appropriately and raises an
exception if a hook function should, by weird coincidence, attempt to
return {'null':''}.
Signed-off-by: martin f. krafft <madduck@madduck.net>
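On the Perl side, recognising the sentinel described above amounts to something like this (the helper name is hypothetical):

    # treat {'null' => ''} as undef when it arrives over XML-RPC
    sub from_xmlrpc_value {
        my $value = shift;
        if (ref $value eq 'HASH' && keys(%$value) == 1 &&
            exists $value->{null} && $value->{null} eq '') {
            return undef;
        }
        return $value;
    }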
Markdown is slow. Especially if it has to process an enormous page. The
most common enormous page is currently the recentchanges page, which gets
processed a lot, and contains very little actual markdown. Most of it is a
big <div>, which markdown skips ... slowly.
This is a rather sick optimisation to work around markdown's speed issues.
Now inline inserts a small, dummy div, allows markdown to quickly render
the actual page content, then replaces the dummy with the actual inlined
pages later.
Results: Rendering just a recentchanges page, with diffs included, dropped
from 4.5 seconds to 2.7 seconds on my laptop. Building the entire wiki
dropped from 46.6 seconds to 39.5 seconds.
(It would be better if inline were a *post*-processor directive.)
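The trick, roughly (%pending and $placeholder_count are illustrative names; the real code lives in the inline plugin):

    # hand markdown a tiny placeholder instead of the inlined content
    my $id = $placeholder_count++;
    $pending{$id} = $inlined_html;
    return "<div class=\"inline\" id=\"$id\"></div>\n";

    # ... then, after markdown has rendered the page ...
    $content =~ s{<div class="inline" id="(\d+)"></div>}{$pending{$1}}eg;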
xml rpc only allows functions to return a single value, no lists. So getargv
needs to return a list reference, which means that the caller will see an xml
rpc array.
This works around a perl crasher bug, and also avoids bloating pages
with enormous diffs.
rcs_recentchanges modified to return a list in an array context.
If a diff of the first commit in a git repo was requested, it would fail and
print to stderr since first^ isn't valid. Using git show will always work.
set to the destination page. This avoids need for hacks to munge the urls
in preview mode, which fixes several bugs.
* Several destpage fixes in plugins.
Adds an optional xrds-location parameter to the openid meta handler,
which allows for XRDS delegation.
A good document on XRDS is
http://www.windley.com/archives/2007/05/using_xrds.shtml
Signed-off-by: martin f. krafft <madduck@madduck.net>
correct, and it's certainly not correct now, since the wiki is locked
before rcs_commit is ever called, and should not be unlocked by
rcs_commit either.
Markdown is such a splintered mess.. The current debian package provides
only Text::Markdown::Markdown, while all versions of Text::Markdown support
Text::Markdown::markdown; old versions also support the capitalised version,
while new ones don't.
It's getting to the point where `grep /markdown/i %symbol_table` is the only
sane way to figure out what function to call..
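In practice that boils down to probing for whichever entry point exists, something like:

    eval q{require Text::Markdown};
    my $renderer;
    if (defined &Text::Markdown::markdown) {
        $renderer = \&Text::Markdown::markdown;
    }
    elsif (defined &Text::Markdown::Markdown) {
        $renderer = \&Text::Markdown::Markdown;
    }
    else {
        die "no usable markdown function found";
    }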
* rcs_diff is a new function that rcs modules should implement.
* Implemented rcs_diff for git, svn, and tla (tla version untested).
Mercurial and monotone still todo.
Add special handling for <meta name="robots" ...> which need not be
scrubbed as it's harmless.
Signed-off-by: martin f. krafft <madduck@madduck.net>
(cherry picked from commit b15d0299a7f7b147e89d8a202d6cca1c21491af2)
A new regexp fixes this bug:
http://ikiwiki.info/bugs/No_link_for_blog_items_when_filename_contains_a_colon/
I traced this down to htmlscrubber. If disabled,
it works. If enabled, then $safe_url_regexp
determines the URL unsafe because of the colon and
hence removes the src attribute.
Digging into this, I find that RFC 3986 pretty
much discourages colons in filenames:
"""
A path segment that contains a colon character
(e.g., "this:that") cannot be used as the first
segment of a relative-path reference, as it would
be mistaken for a scheme name. Such a segment must
be preceded by a dot-segment (e.g., "./this:that")
to make a relative-path reference.
"""
On the other hand, with usedirs, any link to
another page will be prefixed with ../ anyway, so
that makes them okay again.
The solution still seems to be not to use colons.
In any case, htmlscrubber should get a new regexp,
courtesy of dato.
I have tested and verified this.
Signed-off-by: martin f. krafft <madduck@madduck.net>
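The idea behind the new htmlscrubber check, as a sketch (the exact regexp differs):

    my $safe_protocols = qr/^(?:https?|ftp|mailto|irc):/i;
    sub url_is_safe {
        my $url = shift;
        return 1 if $url =~ $safe_protocols;   # whitelisted scheme
        # otherwise only safe if no colon appears before the first /, ? or #,
        # so "this:that" is rejected but "./this:that" and "a/this:that" pass
        my ($first_segment) = $url =~ m{^([^/?#]*)};
        return $first_segment !~ /:/;
    }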
As was already done for linkification, links generated in a preview page
are relative to the top of the wiki, so it has to be told that the destpage
is there.
I was using "" to indicate this, but that may confuse some preprocessor
plugins, which treat parameters with an empty value specially (sparkline is one
such). Instead, use "/", which is more accurate anyway and works just as well.
which forced a scan of the page to make available metadata that
appeared after the inline directive. Problem is that scan made it forget
about any other files rendered due to the page. The scan also turns out
to be unnecessary now, since meta persistently stores state and it's
always available. So it was just removed.
(as preserving the full list across preview would be tricky). Userdirs
were still being offered as an option there; remove them.
* Fix a bug where user A created a page concurrently with user B, and
when B previewed it would redirect B to A's new page, losing B's work.
Instead, don't redirect and let conflict handling resolve it.
containing ikiwiki.cgi, but this should not change the urls to the style
sheets etc. Add a new forcebareurl parameter to misctemplate to allow
it to do that.
of XML::RPC's default of us-ascii. Allows interoperation with
python's xmlrpc library, which threw invalid encoding exceptions and
caused the rst plugin to hang.
Some browsers interpret about: URIs like a limited version of data:
URIs. In particular, some versions of Internet Explorer interpret
arbitrary HTML content in about: URIs.
just avoid actually writing the files. This is necessary because ikiwiki
saves state after a preview (in case it actually *did* write files),
and if will_render isn't called, its security checks will get upset
when the page is saved. Thanks to Edward Betts for his help tracking this
tricky bug down.
- On commits, replace the bidirectional "mtn sync" with a single-direction
"mtn push". No need to pull changes when doing a commit. mtn sync
is still called in rcs_update.
- Support for viewing differences via patches using viewmtn.
Now aggregation will not lock the wiki. Any changes made during aggregation are
merged in with the changed state accumulated while aggregating. A separate
lock file prevents multiple concurrent aggregators. Garbage collection
of orphaned guids is much improved. loadstate() is only called once
per process, so tricky support for reloading wiki state is not needed.
(Tested fairly thoroughly.)
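The separate lock is conceptually just a non-blocking flock (the file name here is illustrative):

    use Fcntl qw(:flock);
    open(my $lock, '>', "$config{wikistatedir}/aggregatelock") || die $!;
    if (! flock($lock, LOCK_EX | LOCK_NB)) {
        exit 0;   # another aggregator is already running; don't pile up
    }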
since this leads to too many problems with web caching, especially with
inlined pages. Properly solving this would involve tracking every page
that contributes to a page's content and using the youngest of them all,
as well as special cases for things like the version plugin, and it's just
too complex to do.
license, and copyright. This can be used to create custom RecentChanges.
* meta: To support the pagespec functions, metadata about pages has to be
retained as pagestate.
* Fix encoding bug when pagestate values contained spaces.
I kept it to a simple global configuration, rather than using the
preprocessor directive for recentchanges, because that had chicken and egg
problems and seemed overcomplicated. This should work reasonably well,
though it would be good to add some more metadata so that more customised
recentchanges pages can be made.
This makes it a lot quicker to deal with lots of recentchanges pages
appearing and disappearing. It avoids needing to clutter up pagespecs with
exclusions for those pages, by making normal pagespecs not match them.
This is important to do because until will_render is called, ikiwiki doesn't
know that the page exists. This avoids recentchanges re-writing every change
page every run.
If a page type starts with an underscore, hide it from the list of page types
in the edit form, and don't allow editing pages of that type. This allows
for plugins to add page types for internal use.
gettext choked on a Unicode apostrophe in the aggregate plugin, which
appeared in a new error message in commit
4f872b5633. Replace it with an ASCII
apostrophe.
the code, since that process can change internal state as needed, and
it will automatically be cleaned up for the parent process, which proceeds
to render the changes.
The -c option to git log/diff-tree produces "merged" diffs with a
different format from normal ones. However, the existing diff-tree
parser only accepted non-merged diff lines. Merged diff lines caused
the parser to get out of sync. This patch adds a full parser for diffs
with any number of parents. See the "DIFF FORMAT FOR MERGES" section in
the git-diff-tree man page for more information.
Signed-off-by: Brian Downing <bdowning@lavos.net>
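Roughly, the parser now has to accept lines of this shape for N parents (only a sketch; the man page has the authoritative field rules):

    # e.g. with two parents:
    # ::100644 100644 100644 <sha> <sha> <sha> MM<TAB>some/file
    if ($line =~ /^(:+)((?:[0-7]+ )+)((?:[0-9a-f]+ )+)([A-Z]+)\t(.*)$/) {
        my $num_parents = length $1;   # one leading colon per parent
        my @modes  = split ' ', $2;    # $num_parents + 1 modes
        my @shas   = split ' ', $3;    # $num_parents + 1 object names
        my $status = $4;               # one status letter per parent
        my $file   = $5;
    }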
remove the enclosing paragraph and newline markdown wraps it in.
This allows removing several hacks around this markdown behavior from
other plugins that htmlize fragments of pages.
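The removal itself is a small regexp along these lines (a sketch that assumes a single-paragraph fragment):

    # markdown wraps a bare fragment in <p>...</p>; strip that back off
    $html =~ s{\A\s*<p>(.*)</p>\s*\z}{$1}s;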
returned (and not run in some cases) rather than the plugins directly
forcing a user to log in.
* opendiscussion: allow editing of the toplevel discussion page,
and, indirectly, allow creating new discussion pages.
* decode_form_utf8 only fixed the utf-8 encoding for fields that were
registered at the time it was called, which was before the
formbuilder_setup hook. Fields added by the hook didn't get decoded.
But it can't be put after the hook either, since plugins using the hook
need to be able to use form values. To fix this dilemma, it's been changed
to a decode_cgi_utf8, which is called on the cgi query object, before the
form is set up, and decodes *all* cgi parameters.
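Conceptually, the replacement helper is just (a sketch):

    use Encode;
    sub decode_cgi_utf8 {
        my $cgi = shift;
        foreach my $f ($cgi->param) {
            $cgi->param($f, map { decode_utf8($_) } $cgi->param($f));
        }
    }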
the needsbuild hook. This resulted in feeds not being removed when pages
were updated, and probably other bugs.
* aggregate: Avoid uninitialised value warning when removing a feed that
has an expired guid.
links required meta to be run during scan, which complicated its data
storage, since it had to clear data stored during the scan pass to avoid
duplicating it during the normal preprocessing pass.
* If you used "meta link", you should switch to either "meta openid" (for
openid delegations), or tags (for internal, invisible links). I assume
that nobody really used "meta link" for external, non-openid links, since
the htmlscrubber ate those. (Tell me differently and I'll consider bringing
back that support.)
* meta: Improved data storage.
* meta: Drop the hackish filter hook that was used to clear
stored data before preprocessing; this hack was ugly and broken (cf.
liw's disappearing openids).
* aggregate: Convert filter hook to a needsbuild hook.
inserting them into the html template. This ensures that markdown
acts on them, even if the value is expanded inside a block-level html
element in the html template. Closes: #454058
* Use a div in the note template rather than a span.
so that more than one plugin can use this hook.
I believe this is a safe change, since only passwordauth uses this hook.
(If some other plugin already used it, it would have broken passwordauth!)
It would be better if it were a formbuilder hook. But the formbuilder hook
is wacked.. I may need to change how that hook works, which would mean
changing the only current user of it, passwordauth.
and forces rebuilds of the pages that contain calendars. So
running ikiwiki --refresh at midnight is now enough, no need for a full
wiki rebuild each midnight.
* calendar: Work around block html parsing bug in markdown 1.0.1 by
enclosing the calendar in an extra div.
which has been reported to cause encoding problems (though I haven't
reproduced them), just catch a failure of markdown, and retry.
(The crazy perl bug magically disappears on the retry.)
Closes: #449379
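The catch-and-retry described above is as simple as it sounds (a sketch; $renderer is whatever markdown function was found):

    my $html = eval { $renderer->($text) };
    if ($@ || ! defined $html) {
        # the failure is not reproducible, so one retry suffices
        $html = eval { $renderer->($text) };
    }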
to be created owned by some group other than the default. Useful
when there's a shared repository with access controlled by a group,
to let ikiwiki run setgid to that group.
* ikiwiki-mass-rebuild: Run build with the user in all their groups.
page name to be expired and reused for several distinct guids. When this
happened, the expiry code counted each past guid that had used that page
name as a currently existing page, and thus expired too many pages.
* Reformat calendar plugin to ikiwiki conventions.
* The calendar plugin made *every* page depend on every other page,
which seemed a wee tiny little bit overkill. Fixed the dependency
calculations (I hope.)
* Removed manual ctime statting code, and just have the calendar plugin use
%pagectime.
ikiwiki via XML RPC. This should be much faster than the old plugin that
had to fork python for every rst page render. Note that if you use
the rst plugin, you now need to have the RPC::XML perl module installed.
showed up where a web edit that added a page caused a near-concurrent
web edit to fail in will_render. While it would be hard to reproduce this,
my analysis is that the failing cgi started first, loaded the index file
(prior to locking), then the other cgi created the new page and rendered
it, and then the failing cgi choked on the new file when _it_ tried to
render it. Ensuring that the index file is loaded after taking the lock
will avoid this bug.
files in some situations, and this is appropriate in some cases, such as
the teximg plugin's error log file.
Such files will be automatically cleaned up at an appropriate later time.
are not included in the map. Include special styling for such pages.
* map: Remove common prefixes and don't over-indent.
* Add class option to htmllink().
* table plugin: The previous version broke WikiLinks inside quoted values.
Fix this by linkifying CSV data after parsing it, while DSV data is still
linkified before parsing.
When looking for git commits that affect the wiki, only include changes
that affect the ikiwiki source directory. If that is not the top-level
directory in this git repository, strip off the prefix as given by
`git-rev-parse --show-prefix` from all names reported by git-log.
Patch by Jamey Sharp <jamey@minilop.net>.
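Conceptually, the prefix stripping described above looks like:

    my $prefix = `git rev-parse --show-prefix`;
    chomp $prefix;
    # names from git-log are repository-relative; make them srcdir-relative
    $file =~ s/^\Q$prefix\E// if length $prefix;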
in the wikilink looked like a table field separator. Avoid this ambiguity
by linkifying the data before parsing it as a table.
* Turn on allow_loose_quotes in the table plugin's Text::CSV object,
so that links from wikilinks don't confuse the parser.
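For example (a sketch of the CSV path; linkify_cell stands in for the real linkification call):

    use Text::CSV;
    my $csv = Text::CSV->new({ allow_loose_quotes => 1, binary => 1 });
    if ($csv->parse($line)) {
        # linkify each cell *after* parsing, so the generated <a href=...>
        # markup (with its quotes) can no longer confuse the CSV parser
        my @cells = map { linkify_cell($_) } $csv->fields;
    }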
* Plugins can add new directories to the search path with the add_underlay
function.
* Split out smiley underlay files into a separate underlay, so if the plugin
isn't used, the wiki isn't bloated with all those files.
calendar, and youtube. Normally, the htmlsanitiser eats these since they
use unsafe tags; the embed plugin overrides it for trusted sites.
* The googlecalendar plugin is now deprecated, and will be removed
eventually. Please switch to using the embed plugin.
- add a title to the editpage form;
- pass a reference to the list of buttons to the formbuilder_setup
hooks, so we can add ours;
- relax assumption about the possible submit values (use "Save Page"
explicitly);
- de-hardcode the submit buttons from the editpage template
(This was needed for compatibility with a bug in CGI::FormBuilder
3.0401, but ikiwiki already needs a newer version.)
* Pass buttons to all other formbuilder_setup hooks too.
atom feeds, and also changing the publication time for a feed to the
newest modification time (was newest creation time).
* The patch also adds dcterms:creator to rss items that have a known author.
* Use type= not style= in html for alternate stylesheets, which is more
correct (but in my testing both epiphany and iceweasel work ok with
style=text/css).
* Support building on systems that lack asprintf.
* mercurial getctime is currently broken, apparently by some change in
mercurial version 0.9.4. Turn the failing test case into a TODO test case.
most web sites serve ikiwiki xhtml files as text/html and mozilla browsers
get confused. So it's best for ikiwiki to follow the compatibility
recommendations in appendix C of the XHTML spec. Closes: #432045
This turns out to have occurred if the cgi wrapper was created by an
ikiwiki invocation that included --rebuild. Thanks to Carl Worth for
tracking that down.
ESCAPE=HTML for titles in the templates for these feeds, and instead
escape the title going in to the template. Previously, the title was
sometimes double-escaped in a feed (if set via meta title), and sometimes
not (if set from the page filename).
* In the meta plugin, when a title is set, encode the html entities in it
numerically. This works better in the current landscape of a rss spec that
doesn't specify encoding, and variously broken feed consumers, according
to <http://www.rssboard.org/rss-profile#data-types-characterdata>.
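One way to get numeric references in Perl, for illustration (not necessarily the exact call the meta plugin uses):

    use HTML::Entities qw(encode_entities_numeric);
    # emits numeric references (e.g. &#8220;) instead of named ones (&ldquo;)
    $title = encode_entities_numeric($title);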
old files.
* Change where the img plugin puts scaled images. It's better to make the
scaled images subpages of the page that embeds them, rather than putting
them alongside the original image, since if two pages scale the same image
the same way, this prevents complications in dealing with two pages
creating the same file. The move will be handled transparently, though you
might want to rebuild your wiki to make it occur in one step.
until the wiki is building and already locked, unless it's aggregating.
When aggregating, it does not wait for the lock if it cannot get it, and
instead exits, to prevent aggregating processes from piling up.
and style sheet updates, and unless you're using customised versions,
you'll want to rebuild wikis on upgrade to this version to avoid
inconsistencies.
* Allow WIKINAME to be used in footers, as an example of something to put
there.
passwordauth page to the basewiki describing password
authentication; like openid, it uses conditional to check which
forms of authentication the wiki allows. Add conditional cross-
links between the openid and passwordauth pages, to help the user
understand how they can log in.
using the mercurial backend. Not 100% sure why it failed w/o the full
path, but this still passes the test suite, and indeed, is how the test
suite calls hg add.
(Get a good message when a PageSpec fails due to a negated success by
creating success objects with a reason string, which morph into failure
objects when negated.)
scalar context, evaluates to a reason why the match failed.
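The mechanism can be sketched with a pair of tiny overloaded classes (names and details here are illustrative):

    package SuccessReason;
    use overload
        'bool' => sub { 1 },
        '""'   => sub { $_[0][0] },                   # the reason string
        '!'    => sub { FailReason->new($_[0][0]) },  # negation flips the class
        fallback => 1;
    sub new { my ($class, $reason) = @_; return bless [ $reason ], $class; }

    package FailReason;
    use overload
        'bool' => sub { 0 },
        '""'   => sub { $_[0][0] },
        '!'    => sub { SuccessReason->new($_[0][0]) },
        fallback => 1;
    sub new { my ($class, $reason) = @_; return bless [ $reason ], $class; }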
* Add testpagespec plugin, which might be useful to see why a pagespec isn't
matching something.
for extended pagespecs. The old calling convention will still work for
back-compat for now.
* The calling convention for functions in the IkiWiki::PageSpec namespace
has changed so they are passed named parameters.
* Plugin interface version increased to 2.00 since I don't anticipate any
more interface changes before 2.0.
on and supported creating it (especially Tumov). This adds a "usedirs"
option that makes ikiwiki use foo/index.html instead of foo.html as
output page names. It is not yet enabled by default.
* Renamed %oldpagemtime to a more accurately named %pagemtime and fix it to
actually store pages' mtimes.
* Add "mtime" sort parameter to inline plugin.
that given link points based on the page doing the linking. Note that this
could make such PageSpecs match different things than before, if you
relied on the old behavior of them only matching the raw link text.
* This required changing the match_* interface, adding a third parameter.
* Allow link() PageSpecs to match relative, as is allowed with globs.
* Add postform option to inline plugin.
* Add a bug tracker to the softwaresite example.
plugins's support for inserting html link and meta tags. Now such content
is passed through the htmlscrubber like everything else.
* Unfortunately, that means that some valid uses of those tags are no longer
usable, and special case methods needed to be added for including
stylesheets, and for doing openid delegation. If you use either of these
in your wiki, it will need to be modified. See the meta plugin docs
for details.
were titlepage escaped in the urls, and then doubly escaped by the CGI
when editing. To fix this, I removed the titlepage escaping in the edit
urls.
* That means that *every edit link* on the wiki is potentially changed.
Rebuilding wikis on upgrade to this version therefore necessary; enabled
that in postinst.
since it ended up being double-escaped. Instead, just remove slashes.
* Fix some nasty issues with page name escaping during previewing
(introduced in 1.44).
previous ugly hack used to avoid writing rss feeds in previews.
* Fix the img plugin to avoid overwriting images in previews. Instead it
does all the work to make sure the resizing works, and dummies up a resized
image using width and height attributes.
* Also fixes img preview display, the links were wrong in preview before.
commit hook, it was possible for one CGI to race another one and "win"
the commit of both their files. This race has been fixed by adding a new
commitlock, which when locked by the CGI, disables the commit hook
(except for commit mails). The CGI then takes care of the updates the
commit hook would have done.
parameters remain the same, but additional options are now passed in using
named parameters.
* Change plugin interface version to 1.02 to reflect this change.
* Add a new anchor option to htmllink. Thanks Ben for the idea.
* Support anchors in wikilinks.
* Add a "more" plugin based on one contributed by Ben to allow implementing
those dreaded "Read more" links in blogs.
including out of disk space situations. ikiwiki should never leave
truncated files, and if the error occurs during a web-based file edit,
the user will be given an opportunity to retry.
Inspired by the many ways Moin Moin destroys itself when out of disk. :-)
* Fix syslogging of errors.
* Add a "conditional" plugin, which allows displaying text if a condition
is true. It is enabled by default so conditional can be used in the
basewiki.
* Use conditionals in the template for plugins, so that plugin pages
say if they're currently enabled or not, and in various other places
in the wiki.
non-page format files in the wiki. To exploit this, the file already had
to exist in the wiki, and the web user would need to somehow use the web
based editor to replace it with malicious content.
(Sorry Josh, this means you can't edit style.css directly anymore,
although I do appreciate your fixes, actually..)
edited.
* Move code forcing signing before edit to a new "signinedit" plugin, and
code checking for locked pages into a new "lockedit" plugin. Both are
enabled by default.
* Remove the anonok config setting. This is now implemented by a new
"anonok" plugin. Anyone with a wiki allowing anonymous edits should
change their configs to enable this new plugin.
* Add an opendiscussion plugin that allows anonymous users to edit
discussion pages, on a wiki that otherwise wouldn't allow it.
* Lots of CGI code reorg and cleanup.
* Fix code to make absolute urls for rss feeds, was missing some urls.
* Fix double-escaping of html entities in titles etc in rss feeds
that occurred if escaped characters were present in the page filename.
manipulate.
* Only exclude rss and atom files from processing if the inline plugin
is enabled and that feed type is enabled. Else it's just a copyable file
type.
* Move rss and atom option handling code into the inline plugin.
* Applied a rather old patch from Recai to fix the "pruning is too strict"
issue. Now you can have wiki source directories inside dotdirs and the
like, if you want.
template won't work with CGI::FormBuilder 3.0401, so disable it for now.
* CGI::FormBuilder 3.0401 seems to work ok now with ikiwiki, although
there might still be bugs lurking..
* Introduce the nicebundle. This is a kind of plugin, that just enables
many other plugins. It's an easy way to boost ikiwiki from its default,
basic wiki, to a full-featured wiki, without manually picking the right
set of plugins. New plugins will be added to the nicebundle from time to
time.
* Add a test suite for the svn backend.
* Daemonize before sending RPC pings, since that can take a while
and/or hang.
* Daemonize before sending commit mails, as that can also take a long
time/hang if the mail server is unhappy.
* Factor out commit mail sending code into new function.
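The daemonizing mentioned above is the usual fork-and-detach, roughly:

    use POSIX qw(setsid);
    defined(my $pid = fork) || die "fork failed: $!";
    exit 0 if $pid;            # parent returns to the caller right away
    setsid();                  # child detaches and does the slow work
    chdir '/';
    open STDIN,  '<', '/dev/null';
    open STDOUT, '>', '/dev/null';
    open STDERR, '>&STDOUT';
    # ... send the RPC pings / commit mails from here ...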
nothing more sophisticated will be needed.
* Add formbuilder_setup and formbuilder hooks.
* Split out a passwordauth module, that holds all the traditional password
based authentication etc code. It's enabled by default, but can be disabled
if you want only openid or some other auth method.
* Make the openid plugin support the callbacks from myopenid.com via its
affiliate program.
* Change how post signin actions are propagated through the signin process;
they're now stored in the session.
* Web commits by OpenID users will record the full OpenID url for the user,
but in recentchanges, these urls will be converted to a simplified display
form+link.
* Modified svn, git, tla backends to recognise such web commits.
FORM-SUBMIT unusable on customised formbuilder templates. For now,
hardcode the submit buttons in editpage.tmpl instead of using the
template variable, which is ok, since the buttons are static.
* Use precalculated backlinks info when determining if files need an update
due to a page they link to being added/removed. Mostly significant if
there are lots of pages.
* Remove duplicate link info when saving index. In some cases it could
pile up rather badly. (Probably not the best way to deal with this
problem.)
8x.
* Add "scan" parameter to hook(), which is used to make the hook be called
during the scanning pass, as well as the render pass. The meta and tag
plugins need to use the new scan parameter, so will any others that modify
%links.
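For the meta plugin, for example, the registration just grows the new flag:

    hook(type => "preprocess", id => "meta", call => \&preprocess, scan => 1);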
* Now that links are calculated in a separate pass, it can also
precalculate backlinks in one pass, which is O(N^2) instead of the
previous code that was O(N^3). A very nice speedup for wikis with lots
(thousands) of pages.
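The precalculation is a single pass over %links, something like:

    # for every page, resolve each of its links once and record the inverse,
    # instead of rescanning every page for every other page
    my %backlinks;
    foreach my $page (keys %links) {
        foreach my $link (@{$links{$page}}) {
            my $best = bestlink($page, $link);
            push @{$backlinks{$best}}, $page if length $best;
        }
    }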
instead of over and over. This is up to 8 times faster than before!
(This could have introduced some subtle bugs, so it needs to be tested
extensively.)
an exception for the wiki's toplevel index page, which will still use the
wikiname as the feed title.
* Sanitize possibly problematic characters out of the polygen grammar names,
just in case. Should not be exploitable anyway, since it only tries to run
polygen after finding the specified grammar file.