It was calling format hooks for each comment on the page.
When relativedate is enabled, that made it insert <script> tags
for each comment. And the browser loaded the same script over and over,
which was slow on its own. But that was nothing compared to running
the onload event over and over.. especially since the hook system
added a new call to the hook each time it loaded.
For a page with 10 comments, that caused the relativedate DOM parsing
code to run 1000 times, I think. Anyway, it was sloow. Now it runs once.
If suitable alternate text is unknown, then it should not be given.
Empty alt text is suitable mainly for purely decorative images.
(cherry picked from commit 3cd7f67f0cf894f4fd5ba16f68e82e4f7bdbfdc5)
Always pass the full (modified) content in `content` named parameter. When the
user edits an existing wiki page, also pass a `diff` named parameter, which
includes only the lines that they added to the page, or modified.
Signed-off-by: intrigeri <intrigeri@boum.org>
Some aggregators, like Planet, sort by mtime rather than ctime. This
means that posts with modified content come to the top (which seems odd
to me, but is presumably what the aggregator's author or operator
wants), but it also means that posts with insignificant edits (like
adding tags) come to the top too. Atom defines <updated> to be the date
of the last *significant* change, so it's fine that ikiwiki defaults to
using the mtime, but it would be good to have a way for the author to
say "that edit was insignificant, don't use that mtime".
That resulted in double encoded display when using perl's stub
readline module. Apparently that module unconditionally upgrades
text to utf8, in a quite braindead way.
(Term::ReadLine::Gnu::Perl worked ok.)
Use mtn for monotone and hg for mercurial. The long names cause ugly
formatting in recentchanges, which has CSS that only allows a few
characters for the commit type column.
All meta titles are first extracted at scan time, i.e. before we turn
PO files back into translated markdown; escaping of double-quotes in
PO files breaks the meta plugin's parsing enough to save ugly titles
to %pagestate at this time.
Then, at render time, every page passes in turn through the Great
Rendering Chain (filter->preprocess->linkify->htmlize), and the meta
plugin's preprocess hook is this time in a position to correctly
extract the titles from slave pages.
This is, unfortunately, too late: if the page A, linking to the page B,
is rendered before B, it will display the wrongly-extracted meta title
as the link text to B.
On the one hand, such a corner case only happens on rebuild: on
refresh, every rendered page is fixed to contain correct meta titles.
On the other hand, it can take some time to get every page fixed.
We therefore re-render every rendered page after a rebuild to fix them
at once. As this more or less doubles the time needed to rebuild the
wiki, we do so only when really needed.
Signed-off-by: intrigeri <intrigeri@boum.org>
It will set up an ikiwiki instance tuned for use in blogging.
As part of this change, move the example sites into /usr/share/ikiwiki so
they are available even if docs are not installed.
Asking for only the head worked in my tests, but I've found a site where it
didn't -- apparently ikiwiki didn't get a chance to finish the refresh
when the url was HEADed. Getting the whole url, and waiting for ikiwiki to
finish, avoided the update problem.
* repolist: New plugin to support the rel=vcs-* microformat.
* goodstuff: Include repolist by default. (But it does nothing until
configured with the repository locations.)
Form validation works, but after trying to save invalid PO content, the user is
brought back to the page he/she was editing, with no clue as to why it was not
saved. The dedicated cansave hook is thus necessary.
Signed-off-by: intrigeri <intrigeri@boum.org>
This has to be done after the rename/remove plugins have added
their buttons, so we set this hook to be run last.
The canrename/canremove hooks already ensure this is forbidden
at the backend level, so this is only UI sugar.
Signed-off-by: intrigeri <intrigeri@boum.org>
The main reason to do so is to bypass the "favor the type of linking page on
page creation" logic, which is unsuitable when a broken link is clicked on
a slave (PO) page.
Signed-off-by: intrigeri <intrigeri@boum.org>
This is not needed by my current use of it, but seems more consistent to me.
Future users of this hook may need this data to make up their minds.
Signed-off-by: intrigeri <intrigeri@boum.org>
... so that nicepagetitle hook's effects, such as translation status displayed
in links, are updated when the linked page changes.
The replacement of 'my %backlinks' with 'our %backlinks' in Render.pm made this
work: previously, every postscan hook was called with an almost empty
%backlinks, which defeated all my attempts to implement this feature.
This feature hits performance a bit. Its cost was quite small in my real-world
use-cases (a few percent longer refresh time), but could be bigger in worst
cases. Time will tell.
NB: this hack could also be used by my meta branch. It may even become an
optional ikiwiki feature.
Signed-off-by: intrigeri <intrigeri@boum.org>
Thanks to the new rename hook behaviour, the whole renaming work is now done
by the rename plugin, and we don't need to remember which pages were renamed.
inline has a format hook that is an optimisation hack. Until this hook
runs, the inlined content is not present on the page. This can prevent
other format hooks, that process that content, from acting on inlined
content. In bug #509710, we discovered this happened commonly for the
embed plugin, but it could in theory happen for many other plugins (color,
cutpaste, etc) that use format to fill in special html after sanitization.
The ordering was essentially random (hash key order). That's kind of a good
thing, because hooks should be independent of other hooks and able to run
in any order. But for things like inline, that just doesn't work.
To fix the immediate problem, let's make hooks able to be registered as
running "first". There was already the ability to make them run "last".
Now, this simple first/middle/last ordering is obviously not going to work
if a lot of things need to run first, or last, since then we'll be back to
being unable to specify ordering inside those sets. But before worrying about
that too much, and considering dependency ordering, etc, observe how few
plugins use last ordering: Exactly one needs it. And, so far, exactly one
needs first ordering. So for now, KISS.
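For illustration, registering a hook to run early now looks something like
this (a minimal sketch using ikiwiki's hook() function; inline's format hook
is the case that needs it):

    hook(type => "format", id => "inline", call => \&format, first => 1);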
Another implementation note: I could have sorted the plugins with
first/last/middle as the primary key, and plugin name secondary, to get a
guaranteed stable order. Instead, I chose to preserve hash order. Two
opposing things pulled me toward that decision:
1. Since hash order is randomish, it will ensure that no accidental
ordering assumptions are made.
2. Assume for a minute that ordering matters a lot more than expected.
Drastically changing the order a particular configuration uses could
result in a lot of subtle bugs cropping up. (I hope this assumption is
false, partly due to #1, but can't rule it out.)
People seem to expect to be able to enter www.foo.com and get away with it.
The resulting my.wiki/www.foo.com link was not ideal.
To fix it, use URI::Heuristic to expand such things into a real url. It
even looks up hostnames in the DNS if necessary.
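For example, with URI::Heuristic's real uf_uristr() function:

    use URI::Heuristic qw(uf_uristr);
    my $url = uf_uristr("www.foo.com");  # expands to "http://www.foo.com"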
A new ikiwiki-transition moveprefs subcommand can pull the old data out of
the userdb and inject it into the setup file.
Note that it leaves the old values behind in the userdb too. I did this
because I didn't want to lose data if it fails writing the setup file for
some reason, and the old data in the userdb will only use a small amount of
space. Running the command multiple times will mostly not change anything.
This leads to better display for OpenIDs like smcv.pseudorandom.co.uk
and thm.id.fedoraproject.org (to take a couple of examples from the
IkiWiki commit history).
None of the comment state needs to be kept around for later runs of
ikiwiki, so move it all from pagestate to more transient storage.
This is assuming that we'll never want to add pagespecs to search against
the comment state. Pagespecs like author() are why the meta plugin does
store its meta data in pagestate -- the data can be needed later to match
against.
The thinking here is that having both a Discussion page and comments for
the same page is redundant, and certainly not what you want if you enable
comments for a page. At first I considered making it configurable via pagespec
which pages get discussion links. But that would mean testing a new pagespec
for every page, and a redundant config setting to keep in sync. So instead,
take a lead from my previous change to make inlined pages have a comments
link, and change the discussion link at the top of regular pages to link to
their comments.
(Implementation is a bit optimised to avoid redundant pagespec checking.)
Jumping to the just posted comment was the impetus, but I killed a number
of birds here.
Added a INLINEPAGE template variable, which can be used to add anchors to
any inline template.
To keep that sufficiently general, it is the full page name, so the
comment anchors and links changed form.
Got rid of the FIXMEd hardcoded html anchor div.
More importantly, the anchor is now at the very top of the comment, not the
text below. So you can see the title, and how the comment is attributed.
Avoid changing the permalink of pages that are not really comments, but
happen to contain the _comment directive. I think that behavior was a bug,
though not a likely one to occur since _comment should only really be used
on comment pages.
I think it is clearer to have one pagespec that controls all pages with
comments, and a separate pagespec that can be used to close new comments on
a subset of those pages.
Not compacting whitespace is the most important one: now that we run
sanitize hooks on individual posted comments in the comments plugin,
whitespace that is significant to Markdown (but not HTML) is lost.
(cherry picked from commit cb5aaa3cee)
The [[!_comment]] directive is a serialization format, not something for
presentation to users, so we should use the least ambiguous possible
representation.
This delays all comment formatting until the last possible time, allows
us to set metadata without worrying that commenters may be able to evade
it, and means that changes to how a comment is saved can be handled
gracefully. It also gives us somewhere to put the commenter's username
or IP address for later reference.
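A saved comment might then look something like this (a sketch; the exact
field names here are illustrative):

    [[!_comment format=mdwn
     username="commenter"
     ip="192.0.2.1"
     subject="An example"
     date="2008-12-12T00:00:00Z"
     content="""
    The raw comment text, formatted only at render time.
    """]]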
This should ensure that users can't "break out" from the enclosing
<div>, making it impossible to forge comments (assuming htmlscrubber
is enabled, and so is either htmlbalance or htmltidy).
wikilinks are harmless, so we might as well allow them.
Access control for this plugin is a bit odd, since we specifically
don't want to allow comments to be edited - so the check is whether the
user is allowed to edit a deliberately invalid page name,
page/commented/on[smcvpostcomment]. You can put smcvpostcomment(*)
or smcvpostcomment(some/subdir/*) in $config{anonok_pagespec}
or the opposite in $config{locked_pages} to allow "editing" (really
just posting) comments.
I wanted this nearer to the top, but decided to put it after the
add_depends. Reasoning: It's possible with a combination of feedpages and
show options to make @list and @feedlist contain completely differing sets
of pages. We want to add_depends all pages in both sets. We could combine
the two lists and add_depends that, but it's slightly more efficient to
defer reducing @feedlist, and add_depends whichever list is longer.
This is a skeleton that does nothing yet.
See the comments in the code for an overview of the issue that arises, due to
the renamepage hook never being called globally.
Signed-off-by: intrigeri <intrigeri@boum.org>
Not implemented yet, 'cos the renamepage hook has to come first.
Else translations would be deleted on rename, what a shame.
Signed-off-by: intrigeri <intrigeri@boum.org>
... instead of already existing ones.
This fixes the "missing otherlanguages links on master pages just created via
the CGI" bug.
Signed-off-by: intrigeri <intrigeri@boum.org>
And enjoy a 10% rebuild time improvement on a complex wiki full of maps and
other pseudo-dynamic content, with some other costly plugins enabled. So it
could well mean 20% on a more usual wiki.
Signed-off-by: intrigeri <intrigeri@boum.org>
This way, the po plugin will not appropriate PO files it is not responsible for,
and PO files existing before this plugin was enabled can coexist peacefully with
our own ones.
Signed-off-by: intrigeri <intrigeri@boum.org>
(I had just removed the dependency on istranslatable from istranslation and
_istranslation... which broke things in a subtle way, hard to see at first
glance.)
Signed-off-by: intrigeri <intrigeri@boum.org>
This is necessary so that things that fork to the background,
like pinger, and inline ping, don't block other cgis from running.
Note that websetup also calls unlockwiki, before refreshing / rebuilding
the wiki. It makes perfect sense for that not to block other cgis.
Fixed by making the cgi wrapper wait on a cgilock.
If you had to set apache's MaxClients low to avoid ikiwiki thrashing
your server, you can now turn it up to a high value.
The downside to this is that a cgi call that doesn't need to call lockwiki
will be serialised by this so only one can run at a time. (For example,
do=search.) There are few such calls, and all of them call loadindex,
so each still eats gobs of memory, so serialising them still seems ok.
It has grown up incrementally and new helper functions were added right in the
middle of the hooks, most often near the place they were used, which is
practical when doing initial development, but quite ugly afterwards, when helper
functions are useful to separate logic and implementation details.
Today's refactoring commits have brought the code to a much more maintainable
state, IMHO.
Signed-off-by: intrigeri <intrigeri@boum.org>
This is not needed now that tagpage returns a page name starting with a
slash.
(Also fixes a minor bug that the edit links started with double slashes due
to the hack.)
It is now more elegant IMHO, and the output is now sorted according to the
language name (instead of code).
Signed-off-by: intrigeri <intrigeri@boum.org>
The very same code was repeated at dozens of places.
NB: the real work is now done in _istranslation(), which is memoized,
so the overhead of the additional function calls should be compensated for.
Signed-off-by: intrigeri <intrigeri@boum.org>
... to prevent the use of Encode::Guess::guess_encoding() in
Locale::Po4a::Transtractor (just a minor security measure, dependent on po4a
internals, but we have no reason to think Encode::Guess is not safe).
Signed-off-by: intrigeri <intrigeri@boum.org>
... by wrapping IkiWiki::urlto in order to work around the hard-coded
/index.$config{htmlext}, which is wrong when usedirs=0, po_link_to=current,
and the homepage is translatable.
Signed-off-by: intrigeri <intrigeri@boum.org>
Now use the change hook to update these files, check them into VCS, and trigger
IkiWiki::refresh as needed. The needsbuild hook's help was required to prevent
infinite looping.
This more rigorous way of doing this fixes recentchanges (that was previously
not updated in some cases), and probably is a better long-term solution than the
two previously tested ones.
Signed-off-by: intrigeri <intrigeri@boum.org>
warnings are displayed if it is set to an invalid or incompatible value
(e.g. po_link_to=negotiated and disabled usedirs)
Signed-off-by: intrigeri <intrigeri@boum.org>
The only change of note is quoting some strings in a regexp, just in case
(this also avoids the . matching any character!)
Mostly whitespace changes of no consequence.
... in a way compatible with various File::Temp versions.
The result is far from being perfect (see comments in the code for details),
but it does work.
Signed-off-by: intrigeri <intrigeri@boum.org>
Manually triggering IkiWiki::refresh() was dubious at best, and more or less
buggy (it randomly broke the whole backlinks feature); thinking a bit more,
adding the necessary bits to @needsbuild seems like a better way. Don't play
with ikiwiki's internals if not absolutely needed.
Signed-off-by: intrigeri <intrigeri@boum.org>
As a trick, use the editcontent hook to mark the page as unfiltered, to force
our filter() subs to proceed again.
Signed-off-by: intrigeri <intrigeri@boum.org>
This reverts commits d9b9022c13 and
39d44d443d. This functionality should now be
achieved using the new inject() function.
Signed-off-by: intrigeri <intrigeri@boum.org>
It makes some test cases cry once every two tries; this may be related to the
artificial way the testsuite is run, or not. In the meantime, stop memoizing
this function.
Signed-off-by: intrigeri <intrigeri@boum.org>
This speeds up web commits by a quarter of a second or so, since perl does
not have to start up for the post commit hook.
perl's locking is completely FUBAR, since it's impossible to tell what perl
flock() really does, and thus difficult to write code in other languages
that interoperates with perl's locking. (Let alone interoperating with
existing fcntl locking from perl...)
In this particular case, I think I was able to find a way to avoid the
insanity, mostly. The C code does a true flock(2), and if perl is using an
incompatible lock method that does not use the same locking primitive at
the kernel level, then the C code's test will fail, and it will go ahead
and run the perl code. Then the perl code's test will test the right thing.
On Debian, at least lately, perl's flock() does a true flock(2), so the
optimisation does work.
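For reference, the perl side boils down to a plain flock() on the lock file,
roughly like this (a sketch, not the exact lockwiki code):

    use Fcntl q{:flock};
    open(my $lock, '>', "$config{wikistatedir}/lockfile") ||
        error("cannot open lockfile");
    flock($lock, LOCK_EX) || error("failed to lock");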
Add an inject function, that can be used by plugins that want to replace
one of ikiwiki's functions with their own version. (This is a scary thing
that grubs through the symbol table, and replaces all exported occurrences
of a function with the injected version.)
external: RPC functions can be injected to replace exported functions.
Removed the stupid displaytime hook, and use injection instead.
The html links already went there, but internally the links were not
recorded as absolute, which could cause confusing backlinks etc.
For example, with tagbase=tags, if blog/tags/bar existed and blog/foo was
tagged bar, it would link to /tags/bar. But, the link would be recorded
simply as a link to tags/bar, and so later blog/tags/bar would appear to
have the backlink.
Need to use a hook because an exported function cannot be reliably
overridden. The replacement version was actually only affecting plugins
loaded after it.
formattime doesn't need a hook, since there's no reason to export it.
... else we can fall into some kind of nasty infinite loop, when two ikiwiki
instances don't store their working copy of the repository at the same place:
every POT file update in one repository would trigger an update of the same POT
file in the other repository, and so on.
Signed-off-by: intrigeri <intrigeri@boum.org>
This is not needed yet, but when newly created POT/PO files are added to
%pagesources and other data structures, we'll need this.
Signed-off-by: intrigeri <intrigeri@boum.org>
Both functions are called very often, and:
- istranslatable has no side effect
- _istranslation is the helper function, without any side effect, for the
istranslation function
Signed-off-by: intrigeri <intrigeri@boum.org>
Only the first filter function call on a given {page,destpage} must convert it
from the PO file; subsequent calls must leave the passed $content unmodified.
Else, preprocessing loops are the rule.
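A sketch of the guard this implies (po_to_markup is assumed to be the
plugin's PO-to-markup helper):

    my %filtered;  # {page}{destpage} pairs that were already converted
    sub filter (@) {
        my %params = @_;
        my ($page, $destpage) = @params{qw(page destpage)};
        unless ($filtered{$page}{$destpage}) {
            $filtered{$page}{$destpage} = 1;
            $params{content} = po_to_markup($page, $params{content});
        }
        return $params{content};
    }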
Signed-off-by: intrigeri <intrigeri@boum.org>
Replaced [[!translatable]] directive with po_translatable_pages setting.
Moved istranslatable/istranslation code to helper functions leaving place for
future caching and/or memoization. The PageSpec functions still work.
Signed-off-by: intrigeri <intrigeri@boum.org>
Apache's content negotiation transparently redirects any old URL (page.html) to
the new one, depending on the client preferred language (i.e. a German browser
will be fed with page.html.de). Transition to this naming convention is then
really smooth.
This naming convention allows one to deliberately display the master page, even
if her browser is configured for another language.
Signed-off-by: intrigeri <intrigeri@boum.org>
- renamed po_supported_languages to po_slave_languages
- added po_master_language option, which will soon be useful
Signed-off-by: intrigeri <intrigeri@boum.org>
- .po is a new supported wiki page type
- PO files are rendered verbatim into HTML
- override targetpage to ease Content Negotiation
Signed-off-by: intrigeri <intrigeri@boum.org>
* Add an underlay for javascript, and add ikiwiki.js containing some utility
code.
* toggle: Stop embedding the full toggle code on each page using it, and
move it to toggle.js in the javascript underlay.
Google has a nice feature, sitesearch, that allows anyone to
limit search results to a specific site. Obviously, this feature can be
used to provide a search engine for the local ikiwiki site without the
need to install any additional software. Just enable the 'google' plugin
and make sure that --url uses the proper hostname. Thanks to Joey for
helping to get the Perl implementation right.
Whenever the edit form is submitted, but not saved, the page location
select should reduce to the currently selected value. This was only done
when previewing before, but is also needed in order to support the case of
adding an attachment to a page that is just being created.
Before this change, the attachment plugin would get a weird value in
$form->field("page"), that did not reflect the actual page location.
newpagefile.
Note that newpagefile is not used here (or in recentchanges) because
the internal use pages they generate are transient and unlikely to
benefit from being put each in their own subdir.
I noticed that ikiwiki/formatting was being rebuilt when any page changed.
This turned out to be because it contained a complex conditional
"enabled(foo) or enabled(bar)", and the conditional plugin did not notice
that this consisted only of enabled() tests, and copied it unchanged into
add_depends. Thus, the page's dependencies were satisfied by any page
change.
The fix is to beef up the parser so that it can handle that and more
complex conditionals, and detect if they consist only of such tests.
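In outline, the detection amounts to something like this (a sketch; the real
parser lives in the conditional plugin, and the test names here are
assumptions):

    # pagespecs built only from static tests can never be affected by
    # changes to other pages, so they need not be added to add_depends
    my $static = qr/(?:enabled|sourcepage|destpage|included)\([^)]*\)/;
    if ($spec =~ /^(?:[\s()!]*(?:$static|and|or)[\s()!]*)+$/) {
        # no dependency needed
    }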
To handle this, avoid populating %renderedfiles in preview, and in expiry,
check if the file is in %renderedfiles; if it is, do not delete it, since it
was saved.
Upgrades to the new index format should be transparent.
The version field is 3, because 1 was the old textual index, 2 was the
pre-versioned format.
This also includes some efficiency improvements to index loading, by
not copying a hash and using a reference.
* htmltidy: Avoid returning undef if tidy fails. Also avoid returning the
untidied content if tidy crashes. In either case, it seems best to tidy
the content to nothing.
* htmltidy: Avoid spewing tidy errors to stderr.
Conflicts:
IkiWiki/Plugin/recentchanges.pm
Note that smcv's approach of using urlto also gets the url right when
redirecting to a non-html file, which is a better approach than my recent
fix to recentchanges.
Seems that the problem is that once the \nnn coming from git is converted
to a single character, decode_utf8 decides that this is a standalone
character, and not part of a multibyte utf-8 sequence, and so does nothing.
I tried playing with the utf-8 flag, but that didn't work. Instead, use
decode("utf8"), which doesn't have the same qualms, and successfully
decodes the octets into a utf-8 character.
Rant:
Think for a minute about fact that any and every program that parses git-log,
or git-show, etc output to figure out what files were in a commit needs to
contain this snippet of code, to convert from git-log's wacky output to a
regular character set:
if ($file =~ m/^"(.*)"$/) {
    ($file=$1) =~ s/\\([0-7]{1,3})/chr(oct($1))/eg;
}
(And it's only that "simple" if you don't care about filenames with
embedded \n or \t or other control characters.)
Does that strike anyone else as putting the parsing and conversion in the
wrong place (ie, in gitweb, ikiwiki, etc, etc)? Doesn't anyone who actually
uses git with utf-8 filenames get a bit pissed off at seeing \xxx\xxx
instead of the utf-8 in git-commit and other output?
I saw this in the wild, apparently a page was not present on disk, but was
in the aggregate db, and not marked as expired either. Not sure how that
happened, but such pages should get marked as expired since they have an
effectively zero ctime.
* edittemplate: Default new page file type to the same type as the template.
(willu)
* edittemplate: Add "silent" parameter. (Willu)
* edittemplate: Link to template, to allow creating it. (Willu)
* goodstuff: Remove otl plugin from the bundle since it needs a significant
external dependency and is not commonly used. If you use otl, make sure
you explicitly enable it now.
* goodstuff: Add more, progress, and table plugins to the bundle.
I don't want to be stuck renaming it later if preprocessor directives are
turned into postprocessor directives. Also, "directives" is shorter and
clearer than "preprocessors".
* teximg: The prefix is configurable, and has changed to not include the
nonstandard mhchem by default. (willu)
* teximg: dvipng is used if available to render images. Its output is
antialiased and better than dvips. If not available, the old dvips+convert
chain will be used. (willu)
* Drop suggests on texlive-science, add suggests on dvipng.
The use of $dummy was not sufficient, because it only stuck around for the
first element after a dummy parent, and was then lost. Instead, use a
$addparent that contains the actual dummy parent, so it can be compared
with the new item to see if we're still under that parent or have moved to
another one.
Its value was being ignored. Some kind of formbuilder bug?
Anyway, prefixing all keys with a section seems like a good idea
generally, in case there's ever overlap.
Had to do this due to one of CGI::FormBuilder's more annoying quirks -- it
loses the value of a checkbox field with only one option, always treating
it as checked.
Move rcs plugin load to loadplugins; move duplicate rcs detection logic out
of individual plugins and into loadplugins. Avoids checkconfig failing when
run twice.
The locked pages configuration is moving to a locked_pages option in the
setup file, and the allowed attachments configuration to
allowed_attachments. The admin prefs page can still be used for these, but
that's deprecated and will only be shown if there's currently a value.
I considered not checking them in, or making the checkin configurable.
However, then they would remain not checked in if edited by a user, which is
probably not desired.
Note that passing undef as the username/ip to rcs_commit_staged may not
result in ideal behavior; the commit may seem to come from "anonymous" with
some revision control systems. Most of them handle it a bit better and just
have it come from whatever user is running the build.
Rather than every monotone rcs_ function calling check_config, just put it
in a checkconfig hook. (But the chdir still needs to be done by every
hook.)
The fix for colons involved adding "./" to some urls. Due to the weird way
inline called urlto, these snuck into feed urls and permalinks. Fix it by
adding an optional third parameter to urlto.
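With the new parameter (the third argument asks urlto for an absolute url,
which is what feeds and permalinks need), a call looks roughly like:

    my $feedurl = urlto($page, $destpage, 1);  # absolute, no "./" prefix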
Have to convert link text to page name going in.
And on the way out, need to replace spaces with underscores in the link
text, which is not normally done with titles.
Previously, if a page changed its type but not its mtime
(e.g. mv page.txt page.mdwn), then it would not be rebuilt.
Now, check if the source of a page has changed,
in which case force a rebuild of that page.
(cherry picked from commit b6a3b8a683fed7a7f6d77a5b3f2dfbd14c849843)
Implemented for git and svn so far.
Note that rcs_commit_staged does assume that the rcs has the ability to
"stage" multiple changes for a later commit. Support for this varies, but
all we really care about is staging removals and renames, which, AFAIK, all
modern rcs's support.
can be used to avoid a security check that is a good safe default, but
problematic overkill in some situations.
I decided to underdocument this, because the option looks ugly, and I don't
want people randomly turning it on because it looks like a good idea. So if
you need it, you'll get an error message mentioning how to fix it.
I think this used to be a fatal error, not just an inline error, so I don't
know why it was never noticed, but if a page that an img directive mentions
gets deleted, bestlink() returns a file that no longer exists, and
srcfile() throws an error.
Note that bestlink's behavior of returning a deleted file could be
considered buggy. But, if it's changed to not do that, the page with the img
on it is not updated at all when the file is removed.
This is overkill for delete, since it's only used on Cancel. But it will be
crucial for rename, so as to restore any pending edits after renaming a
page.
The trailer line was a bit complex and ugly;
I think it's better to just put "(web)" after the user
name.
This has a side effect of making web commits with no messages
have a completely empty commit message. Use --cleanup=verbatim
to force git to accept such.
The committer's email address is not used (because leaking email addresses
is not liked by many users). Closes: #451023
A "Web-commit" trailer is added, to allow telling the difference between
web commits and direct commits.
What was really going on is that expanding a smiley modified the string and
reset the match process. Force set pos so it continues on from the expanded
smiley.
Smileys need to be double-escaped to work, since the smiley plugin runs as
a sanitize hook, and markdown helpfully removes one level of escapes first.
There were some bugs in the smiley handling code that made escaped smileys
still be expanded. After unescaping a smiley, it needed to move pos forward
past it or the next pass would expand it.
Also, once the m//g got to the end, it seemed to loop back through and make
one more pass (a difference in perl 5.10's regexp engine?). I observed that
pos was undefined when this happened, so added a `last unless defined pos`.
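The pos() bookkeeping described above boils down to something like this (a
simplified sketch of the loop; $smiley_regexp is assumed to capture the
smiley text in $1):

    while ($$content =~ m/$smiley_regexp/g) {
        my $start = $-[0];
        my $expansion = $smileys{$1};
        substr($$content, $start, length($1)) = $expansion;
        # modifying the string reset pos; resume just past the expansion
        pos($$content) = $start + length($expansion);
    }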
* Renamed to parentlinks every single variable or function called
pedigree
* Removed the parentlinks function from Render.pm
* Enabled the new parentlinks plugin by default
* Adapted testsuite and documentation to reflect the above facts
Signed-off-by: intrigeri <intrigeri@boum.org>
Usage:
1. Update all pagespecs that use aggregated pages to use internal()
2. ikiwiki-transition aggregateinternal $srcdir $htmlext
(where $srcdir and $htmlext are the srcdir and htmlext options in
your .setup file)
3. Add aggregateinternal to your .setup file
4. Rebuild the wiki
... at least when it's not used in the same template as
PEDIGREE_BUT_TWO_OLDEST (see Known bugs section in pedigree.mdwn for
details)
Signed-off-by: intrigeri <intrigeri@boum.org>
... after having learned a bit of Perl, knocked my head against
Perl references and arrays of hashes, tried to use some nice
functional programming constructs - no success - to make things
more generic... I'm back to the roots, with this simple code :)
Signed-off-by: intrigeri <intrigeri@boum.org>
If hardlinks are enabled, it would hardlink files from the underlay. That
was sorta annoying if you tried to edit by hand for some reason, so let's
not. Files that are hardlinked should be rare enough that a few extra stats
won't hurt.
This addresses <http://ikiwiki.info/todo/aggregate_to_internal_pages/>
in a simple way. With this approach, a flag day is required, on which all
users of aggregated pages start to inline them using the internal() pagespec;
after that, the aggregateinternal option can safely be switched on in the
setup file (and the old aggregated pages can be deleted by hand).
This ensures that the same link is reached as is used on pages,
so browsers will know that the link on pages has been visited, and color it
appropriately.
The constructor can fail with a useless error message if a module fails to
load. Work around this by evaling it, checking for failures, and
printing CGI::Session->errstr to get a more useful message.
This is truly horribly disgusting. CGI::tmpFileName, in current perls, is
an undocumented function (which should be a clue..) that takes the original
filename of an uploaded attachment, and returns the name of the tempfile
that CGI has stored it in.
In old perls, though, CGI::tmpFileName does not take a filename. It takes
a key from the object's {'.tmpfiles'} hash. This key is something
crazy like '*Fh::fh00001group' -- apparently the stringification of a
filehandle object.
Just to add to the fun, tmpFileName doesn't take the key, it expects a
reference to the key. Argh?!
But the fun doesn't stop there, because in perl 5.8, CGI.pm is also broken
in two other ways. The upload() method is supposed to return a filehandle
to the temp file. It doesn't. The param() method is supposed to return
a filehandle to the temp file, that stringifies to the original filename.
It returns just the original filename, no filehandle.
Combine all these bugs, and you end up with this disgusting commit. Since
I have no way to get the filehandle, I *need* to get the tempfile name.
If I had the filehandle, I could probably pass it into tmpFileName, and
it might stringify to the right key name. But I don't, so the only way to
determine the key is to grub through the .tmpfiles hash ourselves.
And finally, once the temp file name is discovered, a filehandle can finally
be obtained by (re)opening it.
I recommend that this commit be reverted when perl 5.8 is a mercifully
faded memory.
I'm really, really, really glad I'm actually being paid for working on
this right now!
* The editpage form now uses the raw page name, not the page title, in its
'page' cgi parameter. Using the title was ambiguous and made it
impossible to tell between some pages, like "foo/bar" and "foo__47__bar",
sometimes causing the wrong page to be edited.
* This change means that some edit links need to be updated.
Force a rebuild on upgrade to this version.
* Above change also allowed really fixing escaped slashes from the blogpost
form.
Currently includes UI, and a few tests of the attachment, as well as the
framework to extend pagespecs to test attachments. Does not actually save
the file yet.
* toc: Revert change in 2.45 that made it run at sanitize time. This breaks
use of toc in a sidebar.
* Call format hooks when generating page previews, thus fixing toc display
there, as well as fixing inline to again display in page previews, since
it's started using format hooks. This also allows several other things,
like embed, that use format hooks, to work during page preview time.
* Format hooks should not rely on getting an entire html document, as they
will only get the body during page preview.
* toggle: Deal with preview mode when adding javascript.
This change needs libtext-wikicreole-perl (>= 0.05-2).
Also removing custom link function, there's no need for it -
if it is not defined, the unmodified markup will be returned.
Can't sort by titles; the tree building logic requires that the list be
sorted by page name.
Setting linktext => $page is not the same as omitting it entirely. So some
contortions to only set linktext when the page name is not being shown.
This occurred when a plugin, loaded earlier, filled out a template in its
checkconfig, before recentchanges's checkconfig had run. Since such a
template won't be a recentchanges template, just test for the value being
uninitialized and skip processing.
Using the title obscured path info, and made search results look
inconsistent. Since nothing else uses the title like that, it didn't make
sense for search to.
Because the search plugin needed it, also because it's one of the few
plugins that didn't already have it.
I also considered adding it to htmlize, but I really cannot imagine caring
what the destpage is when htmlizing. (I'll probably be proven wrong later.)
Something has changed in CGI.pm in perl 5.10. It used to not care
if STDIN was opened using :utf8, but now it'll mis-encode utf-8 values
when used that way by ikiwiki. Now I have to binmode(STDIN) before
instantiating the CGI object.
In 57bba4dac1, I changed from decoding
CGI::Formbuilder fields to utf-8, to decoding cgi parameters before setting
up the form object. As of perl 5.10, that approach no longer has any effect
(reason unknown). To get correctly encoded values in FormBuilder forms,
they must once again be decoded after the form is set up.
As noted in 57bba4da, this can cause one set of problems for
formbuilder_setup hooks if decode_form_utf8 is called before the hooks, and
a different set if it's called after. To avoid both sets of problems, call
it both before and after. (Only remaining problem is the sheer ugliness and
inefficiency of that..)
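In outline (decode_form_utf8 and run_hooks are the real ikiwiki functions;
the rest is schematic):

    decode_form_utf8($form);  # so formbuilder_setup hooks see decoded values
    run_hooks(formbuilder_setup => sub {
        shift->(form => $form, cgi => $q, session => $session,
            buttons => $buttons);
    });
    decode_form_utf8($form);  # and fields added by the hooks get decoded too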
I think that these changes will also work with older perl versions, but I
haven't checked.
Also, in the case of the poll plugin, the cgi parameter needs to be
explicitly decoded before it is used to handle utf-8 values. (This may have
always been broken, not sure if it's related to perl 5.10 or not.)
Turns out duplicate index files do not need to be stored when usedirs is in
use, just when it's not. Ikiwiki is quite consistent about using page/ when
usedirs is in use. (The only exception is the search plugin, which needs
fixing.)
This also includes significant code cleanup, removal of an incorrect special
case for empty files, and addition of a workaround for a bug in the amazon
perl module.
The fix involved embedding the session id in the forms, and not allowing the
forms to be submitted if the embedded id does not match the session id.
In the case of the preferences form, if the session id is not embedded,
then the CGI parameters are cleared. This avoids a secondary attack where the
link to the preferences form prefills password or other fields, and
the user hits "submit" without noticing these prefilled values.
In the case of the editpage form, the anonok plugin can allow anyone to edit,
and so I chose not to guard against CSRF attacks against users who are not
logged in. Otherwise, it also embeds the session id and checks it.
For page editing, I assume that the user will notice if content or commit
message is changed because of CGI parameters, and won't blindly hit save page.
So I didn't block those CGI parameters. (It's even possible to use those CGI
parameters, for good, not for evil, I guess..)
The only other CSRF attack I can think of in ikiwiki involves the poll plugin.
It's certainly possible to set up a link that causes the user to unknowingly
vote in a poll. However, the poll plugin is not intended to be used for things
that people would want to attack, since anyone can after all edit the poll page
and fill in any values they like. So this "attack" is ignorable.
The issue is that we need to convert relative links to absolute
ones for atom and rss feeds -- but there are two types of
relative links. The first kind, relative to the current
document ( href="some/path") is handled correctly. The second
kind of relative url is relative to the http server
base (href="/semi-abs/path"), and that broke.
It broke because we just prepended the url of the current
document to the href (http://host/path/to/this-doc/ + link),
which gave us, in the first place:
http://host/path/to/this-doc/some/path [correct], and
http://host/path/to/this-doc//semi-abs/path [wrong]
The fix is to calculate the base for the http server (the base of
the wiki does not help, since the base of the wiki can be
different from the base of the http server -- I have, for example,
"url => http://host.name.mine/blog/manoj/"), and prepend that to
the relative references that start with a /.
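Schematically, the fix looks like this (a rough sketch, not the actual
patch):

    # extract scheme://host[:port] from the configured url
    my ($serverbase) = $config{url} =~ m!^(\w+://[^/]+)!;
    # server-relative links get the server base, not the document url
    $content =~ s{(href|src)="(/[^"]*)"}{$1="$serverbase$2"}g;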
This has been tested.
Signed-off-by: Manoj Srivastava <srivasta@debian.org>
destpage does not normally need to be worried about when creating other files
as part of the process of rendering a page. Using destpage results in
inlined pages creating two copies of such files. It works to not use destpage
in this case because the inlining page depends on the source page, so if the
source page is modified or deleted the inlining page will be updated.
Instead of using the XML-RPC v2 extension <nil/>, which Perl's
XML::RPC::Parser does not (yet) support (Joey's patch is pending), we
agreed on a sentinel: {'null':''}, that is, a hash with a single key
"null" pointing to the empty string.
The Python proxy automatically converts None appropriately and raises an
exception if a hook function should, by weird coincidence, attempt to
return {'null':''}.
Signed-off-by: martin f. krafft <madduck@madduck.net>
Markdown is slow. Especially if it has to process an enormous page. The
most common enormous page is currently the recentchanges page, which gets
processed a lot, and contains very little actual markdown. Most of it is a
big <div>, which markdown skips ... slowly.
This is a rather sick optimisation to work around markdown's speed issues.
Now inline inserts a small, dummy div, allows markdown to quickly render
the actual page content, then replaces the dummy with the actual inlined
pages later.
Results: Rendering just a recentchanges page, with diffs included, dropped
from 4.5 seconds to 2.7 seconds on my laptop. Building the entire wiki
dropped from 46.6 seconds to 39.5 seconds.
(It would be better if inline were a *post*-processor directive.)
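The trick in outline (close to what inline now does, though simplified, and
preprocess_sketch is a hypothetical distillation of inline's preprocess):

    my @inlined;
    sub preprocess_sketch {
        my $renderedpages = shift;
        push @inlined, $renderedpages;  # stash the expensive html
        return '<div class="inline" id="' . $#inlined . '"></div>';
    }
    sub format (@) {  # runs after markdown has done its (now quick) work
        my %params = @_;
        $params{content} =~ s{<div class="inline" id="(\d+)"></div>}
            {$inlined[$1]}g;
        return $params{content};
    }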
xml rpc only allows functions to return a single value, no lists. So getargv
needs to return a list reference, which means that the caller will see an xml
rpc array.
This works around a perl crasher bug, and also avoids bloating pages
with enormous diffs.
rcs_recentchanges modified to return a list in an array context.
If a diff of the first commit in a git repo was requested, it would fail and
print to stderr since first^ isn't valid. Using git show will always work.
set to the destination page. This avoids need for hacks to munge the urls
in preview mode, which fixes several bugs.
* Several destpage fixes in plugins.
Adds an optional xrds-location parameter to the openid meta handler,
which allows for XRDS delegation.
A good document on XRDS is
http://www.windley.com/archives/2007/05/using_xrds.shtml
Signed-off-by: martin f. krafft <madduck@madduck.net>
correct, and it's certainly not correct now, since the wiki is locked
before rcs_commit is ever called, and should not be unlocked by
rcs_commit either.
Markdown is such a splintered mess.. The current debian package provides
only Text::Markdown::Markdown, while all versions of Text::Markdown support
Text::Markdown::markdown, and old versions also support the capitalised version,
while new ones don't.
It's getting to the point where `grep /markdown/i %symbol_table` is the only
sane way to figure out what function to call..
* rcs_diff is a new function that rcs modules should implement.
* Implemented rcs_diff for git, svn, and tla (tla version untested).
Mercurial and monotone still todo.
Add special handling for <meta name="robots" ...> which needs not be
scrubbed as it's harmless.
Signed-off-by: martin f. krafft <madduck@madduck.net>
(cherry picked from commit b15d0299a7f7b147e89d8a202d6cca1c21491af2)
A new regexp fixes this bug:
http://ikiwiki.info/bugs/No_link_for_blog_items_when_filename_contains_a_colon/
I traced this down to htmlscrubber. If disabled,
it works. If enabled, then $safe_url_regexp
determines the URL unsafe because of the colon and
hence removes the src attribute.
Digging into this, I find that RFC 3986 pretty
much discourages colons in filenames:
"""
A path segment that contains a colon character
(e.g., "this:that") cannot be used as the first
segment of a relative-path reference, as it would
be mistaken for a scheme name. Such a segment must
be preceded by a dot-segment (e.g., "./this:that")
to make a relative-path reference.
"""
On the other hand, with usedirs, any link to
another page will be prepended by ../ anyway, so
that makes them okay again.
The solution still seems not to use colons.
In any case, htmlscrubber should get a new regexp,
courtesy of dato.
I have tested and verified this.
Signed-off-by: martin f. krafft <madduck@madduck.net>
As was already done for linkification, links generated in a preview page
are relative to the top of the wiki, so it has to be told that the destpage
is there.
I was using "" to indicate this, but that may confuse some preprocessor
plugins, which treat parameters with an empty value specially (sparkline is one
such). Instead, use "/", which is more accurate anyway and works just as well.
which forced a scan of the page to make available metadata that
appeared after the inline directive. The problem is that the scan made it
forget about any other files rendered due to the page. The scan also turns out
to be unnecessary now, since meta persistently stores state and it's
always available. So it was just removed.
(as preserving the full list across preview would be tricky). Userdirs
were still being offered as an option there, remove them.
* Fix a bug where user A created a page concurrently with user B, and
when B previewed it would redirect B to A's new page, losing B's work.
Instead, don't redirect and let conflict handling resolve it.
containing ikiwiki.cgi, but this should not change the urls to the style
sheets etc. Add a new forcebareurl parameter to misctemplate to allow
it to do that.
of XML::RPC's default of us-ascii. Allows interoperation with
python's xmlrpc library, which threw invalid encoding exceptions and
caused the rst plugin to hang.
Some browsers interpret about: URIs like a limited version of data:
URIs. In particular, some versions of Internet Explorer interpret
arbitrary HTML content in about: URIs.
just avoid actually writing the files. This is necessary because ikiwiki
saves state after a preview (in case it actually *did* write files),
and if will_render isn't called its security checks will get upset
when the page is saved. Thanks to Edward Betts for his help tracking this
tricky bug down.
- On commits, replace "mtn sync" bidirectional with "mtn push" single
direction. No need to pull changes when doing a commit. mtn sync
is still called in rcs_update.
- Support for viewing differences via patches using viewmtn.
Now aggregation will not lock the wiki. Any changes made during aggregation are
merged in with the changed state accumulated while aggregating. A separate
lock file prevents multiple concurrent aggregators. Garbage collection
of orphaned guids is much improved. loadstate() is only called once
per process, so tricky support for reloading wiki state is not needed.
(Tested fairly thoroughly.)
since this leads to too many problems with web caching, especially with
inlined pages. Properly solving this would involve tracking every page
that contributes to a page's content and using the youngest of them all,
as well as special cases for things like the version plugin, and it's just
too complex to do.
license, and copyright. This can be used to create custom RecentChanges.
* meta: To support the pagespec functions, metadata about pages has to be
retained as pagestate.
* Fix encoding bug when pagestate values contained spaces.
I kept it to a simple global configuration, rather than using the
preprocessor directive for recentchanges, because that had chicken and egg
problems and seemed overcomplicated. This should work reasonably well,
though it would be good to add some more metadata so that more customised
recentchanges pages can be made.
This makes it a lot quicker to deal with lots of recentchanges pages
appearing and disappearing. It avoids needing to clutter up pagespecs with
exclusions for those pages, by making normal pagespecs not match them.
This is important to do because until will_render is called, ikiwiki doesn't
know that the page exists. This avoids recentchanges re-writing every change
page every run.
If a page type starts with an underscore, hide it from the list of page types
in the edit form, and don't allow editing pages of that type. This allows
for plugins to add page types for internal use.
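For instance, a plugin can register an internal-only page type like this (a
sketch; the comments plugin does essentially this for its _comment type):

    # the leading underscore keeps "_comment" out of the edit form's page
    # type list, and pages of this type cannot be edited directly
    hook(type => "htmlize", id => "_comment", call => \&htmlize);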
gettext choked on a Unicode apostrophe in the aggregate plugin, which
appeared in a new error message in commit
4f872b5633. Replace it with an ASCII
apostrophe.
the code, since that process can change internal state as needed, and
it will automatically be cleaned up for the parent process, which proceeds
to render the changes.
The -c option to git log/diff-tree produces "merged" diffs with a
different format from normal ones. However, the existing diff-tree
parser only accepted non-merged diff lines. Merged diff lines caused
the parser to get out of sync. This patch adds a full parser for diffs
with any number of parents. See the "DIFF FORMAT FOR MERGES" section in
the git-diff-tree man page for more information.
Signed-off-by: Brian Downing <bdowning@lavos.net>
remove the enclosing paragraph and newline markdown wraps it in.
This allows removing several hacks around this markdown behavior from
other plugins that htmlize fragments of pages.
returned (and not run in some cases) rather than the plugins directly
forcing a user to log in.
* opendiscussion: allow editing of the toplevel discussion page,
and, indirectly, allow creating new discussion pages.
* decode_form_utf8 only fixed the utf-8 encoding for fields that were
registered at the time it was called, which was before the
formbuilder_setup hook. Fields added by the hook didn't get decoded.
But it can't be put after the hook either, since plugins using the hook
need to be able to use form values. To fix this dilemma, it's been changed
to a decode_cgi_utf8, which is called on the cgi query object, before the
form is set up, and decodes *all* cgi parameters.
the needsbuild hook. This resulted in feeds not being removed when pages
were updated, and probably other bugs.
* aggregate: Avoid uninitialised value warning when removing a feed that
has an expired guid.
links required meta to be run during scan, which complicated its data
storage, since it had to clear data stored during the scan pass to avoid
duplicating it during the normal preprocessing pass.
* If you used "meta link", you should switch to either "meta openid" (for
openid delegations), or tags (for internal, invisible links). I assume
that nobody really used "meta link" for external, non-openid links, since
the htmlscrubber ate those. (Tell me differently and I'll consider bringing
back that support.)
* meta: Improved data storage.
* meta: Drop the hackish filter hook that was used to clear
stored data before preprocessing, this hack was ugly, and broken (cf:
liw's disappearing openids).
* aggregate: Convert filter hook to a needsbuild hook.
inserting them into the html template. This ensures that markdown
acts on them, even if the value is expanded inside a block-level html
element in the html template. Closes: #454058
* Use a div in the note template rather than a span.
so that more than one plugin can use this hook.
I believe this is a safe change, since only passwordauth uses this hook.
(If some other plugin already used it, it would have broken passwordauth!)
It would be better if it were a formbuilder hook. But the formbuilder hook
is wacked.. I may need to change how that hook works, which would mean
changing the only current user of it, passwordauth.
and forces rebuilds of the pages that contain calendars. So
running ikiwiki --refresh at midnight is now enough, no need for a full
wiki rebuild each midnight.
* calendar: Work around block html parsing bug in markdown 1.0.1 by
enclosing the calendar in an extra div.
which has been reported to cause encoding problems (though I haven't
reproduced them), just catch a failure of markdown, and retry.
(The crazy perl bug magically disappears on the retry.)
Closes: #449379
to be created owned by some group other than the default. Useful
when there's a shared repository with access controlled by a group,
to let ikiwiki run setgid to that group.
* ikiwiki-mass-rebuild: Run build with the user in all their groups.
page name to be expired and reused for several distinct guids. When this
happened, the expiry code counted each past guid that had used that page
name as a currently existing page, and thus expired too many pages.
* Reformat calendar plugin to ikiwiki conventions.
* The calendar plugin made *every* page depend on every other page,
which seemed a wee tiny little bit overkill. Fixed the dependency
calculations (I hope.)
* Removed manual ctime statting code, and just have the calendar plugin use
%pagectime.
ikiwiki via XML RPC. This should be much faster than the old plugin that
had to fork python for every rst page render. Note that if you use
the rst plugin, you now need to have the RPC::XML perl module installed.
showed up where a web edit that added a page caused a near-concurrent
web edit to fail in will_render. While it would be hard to reproduce this,
my analysis is that the failing cgi started first, loaded the index file
(prior to locking) then the other cgi created the new page and rendered
it, and then the failing cgi choked on the new file when _it_ tried to
render it. Ensuring that the index file is loaded after taking the lock
will avoid this bug.
files in some situations, and this is appropriate in some cases, such as
the teximg plugin's error log file.
Such files will be automatically cleaned up at an appropriate later time.
are not included in the map. Include special styling for such pages.
* map: Remove common prefixes and don't over-indent.
* Add class option to htmllink().
* table plugin: The previous version broke WikiLinks inside quoted values.
Fix this by linkifying CSV data after parsing it, while DSV data is still
linkified before parsing.
When looking for git commits that affect the wiki, only include changes
that affect the ikiwiki source directory. If that is not the top-level
directory in this git repository, strip off the prefix as given by
`git-rev-parse --show-prefix` from all names reported by git-log.
Patch by Jamey Sharp <jamey@minilop.net>.
in the wikilink looked like a table field separator. Avoid this ambiguity
by linkifying the data before parsing it as a table.
* Turn on allow_loose_quotes in the table plugin's Text::CSV object,
so that links from wikilinks don't confuse the parser.
* Plugins can add new directories to the search path with the add_underlay
function.
* Split out smiley underlay files into a separate underlay, so if the plugin
isn't used, the wiki isn't bloated with all those files.
calendar, and youtube. Normally, the htmlsanitiser eats these since they
use unsafe tags; the embed plugin overrides it for trusted sites.
* The googlecalendar plugin is now deprecated, and will be removed
eventually. Please switch to using the embed plugin.
- add a title to the editpage form;
- pass a reference to the list of buttons to the formbuilder_setup
hooks, so we can add ours;
- relax assumption about the possible submit values (use "Save Page"
explicitly);
- de-hardcode the submit buttons from the editpage template
(This was needed for compatibility with a bug in CGI::FormBuilder
3.0401, but ikiwiki already needs a newer version.)
* Pass buttons to all other formbuilder_setup hooks too.
atom feeds, and also changing the publication time for a feed to the
newest modification time (was newest creation time).
* The patch also adds dcterms:creator to rss items that have a known author.
* Use type= not style= in html for alternate stylesheets, which is more
correct (but in my testing both epiphany and iceweasel work ok with
style=text/css).
* Support building on systems that lack asprintf.
* mercurial getctime is currently broken, apparently by some change in
mercurial version 0.9.4. Turn the failing test case into a TODO test case.
most web sites serve ikiwiki xhtml files as text/html and mozilla browsers
get confused. So it's best for ikiwiki to follow the compatibility
recommendations in appendix C of the XHTML spec. Closes: #432045
This turns out to have occured if the cgi wrapper was created by an
ikiwiki invocation that included --rebuild. Thanks to Carl Worth for
tracking that down.
ESCAPE=HTML for titles in the templates for these feeds, and instead
escape the title going in to the template. Previously, the title was
sometimes double-escaped in a feed (if set via meta title), and sometimes
not (if set from the page filename).
* In the meta plugin, when a title is set, encode the html entities in it
numerically. This works better in the current landscape of a rss spec that
doesn't specify encoding, and variously broken feed consumers, according
to <http://www.rssboard.org/rss-profile#data-types-characterdata>.
old files.
* Change where the img plugin puts scaled images. It's better to make the
scaled images subpages of the page that embeds them, rather than putting
them alongside the original image, since if two pages scale the same image
the same way, this prevents complications in dealing with two pages
creating the same file. The move will be handled transparently, though you
might want to rebuild your wiki to make it occur in one step.
until the wiki is building and already locked, unless it's aggregating.
When aggregating, it does not wait for the lock if it cannot get it, and
instead exits, to prevent aggregating processes from piling up.
and style sheet updates, and unless you're using customised versions,
you'll want to rebuild wikis on upgrade to this version to avoid
inconsistencies.
* Allow WIKINAME to be used in footers, as an example of something to put
there.
passwordauth page to the basewiki describing password
authentication; like openid, it uses conditional to check which
forms of authentication the wiki allows. Add conditional cross-
links between the openid and passwordauth pages, to help the user
understand how they can log in.
using the mercurial backend. Not 100% sure why it failed w/o the full
path, but this still passes the test suite, and indeed, is how the test
suite calls hg add.
(Get a good message when a PageSpec fails due to a negated success by
creating success objects with a reason string, which morph into failure
objects when negated.)
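The morphing trick, sketched (close to ikiwiki's real
SuccessReason/FailReason classes, minus some details):

    package IkiWiki::SuccessReason;
    use overload
        '""' => sub { ${$_[0]} },  # stringifies to the reason
        '0+' => sub { 1 },         # true in boolean context
        '!'  => sub { bless $_[0], 'IkiWiki::FailReason' },  # negation morphs
        fallback => 1;
    sub new { return bless \$_[1], $_[0] }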
scalar context, evaluates to a reason why the match failed.
* Add testpagespec plugin, which might be useful to see why a pagespec isn't
matching something.
for extended pagespecs. The old calling convention will still work for
back-compat for now.
* The calling convention for functions in the IkiWiki::PageSpec namespace
has changed so they are passed named parameters.
* Plugin interface version increased to 2.00 since I don't anticipate any
more interface changes before 2.0.
on and supported creating it (especially Tumov). This adds a "usedirs"
option that makes ikiwiki use foo/index.html instead of foo.html as
output page names. It is not yet enabled by default.
* Renamed %oldpagemtime to a more accurately named %pagemtime and fixed it to
actually store pages' mtimes.
* Add "mtime" sort parameter to inline plugin.
that given link points based on the page doing the linking. Note that this
could make such PageSpecs match different things than before, if you
relied on the old behavior of them only matching the raw link text.
* This required changing the match_* interface, adding a third parameter.
* Allow link() PageSpecs to match relative, as is allowed with globs.
* Add postform option to inline plugin.
* Add a bug tracker to the softwaresite example.
plugins's support for inserting html link and meta tags. Now such content
is passed through the htmlscrubber like everything else.
* Unfortunately, that means that some valid uses of those tags are no longer
usable, and special case methods needed to be added for including
stylesheets, and for doing openid delegation. If you use either of these
in your wiki, it will need to be modified. See the meta plugin docs
for details.
were titlepage escaped in the urls, and then doubly escaped by the CGI
when editing. To fix this, I removed the titlepage escaping in the edit
urls.
* That means that *every edit link* on the wiki is potentially changed.
Rebuilding wikis on upgrade to this version therefore necessary; enabled
that in postinst.
since it ended up being double-escaped. Instead, just remove slashes.
* Fix some nasty issues with page name escaping during previewing
(introduced in 1.44).
previous ugly hack used to avoid writing rss feeds in previews.
* Fix the img plugin to avoid overwriting images in previews. Instead it
does all the work to make sure the resizing works, and dummys up a resized
image using width and height attributes.
* Also fixes img preview display, the links were wrong in preview before.
commit hook, it was possible for one CGI to race another one and "win"
the commit of both their files. This race has been fixed by adding a new
commitlock, which when locked by the CGI, disables the commit hook
(except for commit mails). The CGI then takes care of the updates the
commit hook would have done.
parameters remain the same, but additional options are now passed in using
named parameters.
* Change plugin interface version to 1.02 to reflect this change.
* Add a new anchor option to htmllink. Thanks Ben for the idea.
* Support anchors in wikilinks.
* Add a "more" plugin based on one contributed by Ben to allow implementing
those dreaded "Read more" links in blogs.
including out of disk space situations. ikiwiki should never leave
truncated files, and if the error occurs during a web-based file edit,
the user will be given an opportunity to retry.
Inspired by the many ways Moin Moin destroys itself when out of disk. :-)
* Fix syslogging of errors.
* Add a "conditional" plugin, which allows displaying text if a condition
is true. It is enabled by default so conditional can be used in the
basewiki.
* Use conditionals in the template for plugins, so that plugin pages
say if they're currently enabled or not, and in various other places
in the wiki.
non-page format files in the wiki. To exploit this, the file already had
to exist in the wiki, and the web user would need to somehow use the web
based editor to replace it with malicious content.
(Sorry Josh, this means you can't edit style.css directly anymore,
although I do appreciate your fixes, actually..)
edited.
* Move code forcing signing before edit to a new "signinedit" plugin, and
code checking for locked pages into a new "lockedit" plugin. Both are
enabled by default.
* Remove the anonok config setting. This is now implemented by a new
"anonok" plugin. Anyone with a wiki allowing anonymous edits should
change their configs to enable this new plugin.
* Add an opendiscussion plugin that allows anonymous users to edit
discussion pages, on a wiki that otherwise wouldn't allow it.
* Lots of CGI code reorg and cleanup.