The protection against processing loops (i.e. the alreadyfiltered
stuff) was playing against us: the template plugin triggered a filter
hook run with the very same ($page, $destpage) arguments pair that we
use to identify an already filtered page. Processing an included
template could then mark the whole translation page as already
filtered, which prevented po_to_markup from being called on the PO
content.
This commit only runs the whole PO filter logic when our filter hook is run by
IkiWiki::render, which only happens when the full page needs to be filtered.
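A minimal sketch of the guard, assuming the hook receives ikiwiki's
usual named filter parameters; istranslation, alreadyfiltered,
setalreadyfiltered and po_to_markup are the plugin's own helpers, and
walking the call stack is one way to detect being run by
IkiWiki::render:

    # Walk the call stack to see if IkiWiki::render got us here.
    sub called_from_render {
        my $i = 0;
        while (my @frame = caller($i++)) {
            return 1 if defined $frame[3] && $frame[3] eq "IkiWiki::render";
        }
        return 0;
    }

    sub filter (@) {
        my %params = @_;
        my ($page, $destpage, $content) =
            @params{qw(page destpage content)};

        # Only run the full PO logic on a full-page filter run, so a
        # nested run (e.g. from the template plugin) cannot mark the
        # whole translation page as already filtered.
        if (called_from_render()
            && istranslation($page)
            && ! alreadyfiltered($page, $destpage)) {
            $content = po_to_markup($page, $content);
            setalreadyfiltered($page, $destpage);
        }
        return $content;
    }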
Renamed usershort => nickname.
Note that this means existing user login sessions will not have the nickname
recorded, and so it won't be used for those.
There was some confusion about whether the filename was relative to
srcdir or not. Some test cases and the bzr plugin assumed it was
relative to the srcdir; most everything else assumed it was absolute.
Changed it to relative, for consistency with the rest
of the rcs_ functions.
Using named parameters for these is overdue. Passing the session in a
parameter instead of passing username and IP separately will later allow
storing other session info, like username or part of the email.
Note that these functions are not part of the exported API, and the
prototype change will catch (most) skew, so I am not changing API
versions. Any third-party plugins that call them will need updating,
though.
In the process, the special usernames used when committing changed
po files were lost. Instead of trying to dummy up a session object
for the special username, I just don't pass one, and the commit will
appear to be from whatever user ikiwiki runs as.
Everywhere that REMOTE_ADDR was used, a session object is available, so
instead use its remote_addr method.
In IkiWiki::Receive, stop setting a dummy REMOTE_ADDR.
Note that it's possible for a session cookie to be obtained using one IP
address, and then used from another IP. In this case, the first IP will now
be used. I think that should be ok.
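A hedged sketch of the new calling convention; session is the point
of the change, the other parameter names here are assumptions, and
$session is the CGI::Session object whose remote_addr method replaces
REMOTE_ADDR:

    rcs_commit(
        file    => $file,
        message => $message,
        token   => $rcstoken,
        session => $session,    # carries username, IP, etc.
    );

    # And wherever the CGI environment was read directly:
    my $ip = $session->remote_addr();    # instead of $ENV{REMOTE_ADDR}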
Now the git plugin supports commits with author fields that look like:
Author: http://my.openid/ <me@web>
Then in recentchanges, the short username will be displayed, linking
to the openid.
Particularly useful for the horrible google openids, of course.
This way, an email-like link will be a mailto until a matching page
is created, then it will link to the page. And removing the page will
convert it back to a mailto.
At least two bugfixes in here. First, an old bug: \[[foo#0]] was
displayed as [[foo]], losing the anchor, because the anchor text "0"
evaluated as false. Secondly, a new bug: an email like foo#bar@baz
should not check bestlink("foo@baz").
The following ways to create a link are supported now:
[[url]]
[[text|url]]
url can be one of the following:
- an internal wikilink: will be handled as before
- any other kind of URL, including mailto:; proper links will be created:
<a href="url">url</a>
<a href="url">text</a>
- an email address:
<a href="mailto:url">url</a>
<a href="mailto:url">text</a>
For now, a rebuild is the only way to ensure the changed theme is used.
Ikiwiki normally will not realize style.css has changed, since themes
tend to have the same timestamp for the file.
A short story:
Once there was a unicode string, let's call him Srcdir.
Along came a crufty old File::Find, who went through a tree and pasted each
of the leaves in turn onto Srcdir. But this 90's relic didn't decode the
leaves -- despite some of them using unicode! Poor Srcdir, with these
leaves stuck on him, tainted them with his nice unicode-ness. They didn't
look like leaves at all, but instead garbage.
(In other words, perl's unicode support sucks mightily, and drives
us all to drink and bad storytelling. But we knew that..)
So, srcdir is not normally flagged as unicode, because typically it's pure
ascii. And in that case, things work ok; File::Find finds filenames, which
are not yet decoded to unicode, and appends them to the srcdir, and then
decode_utf8 happily converts the whole thing.
But, if the srcdir does contain utf8 characters, that breaks. Or, if
a YAML setup file is used, YAML::Syck's ImplicitUnicode option sets
the unicode flag of *all* strings, even those containing only ascii.
In either case, srcdir
has the unicode flag set; a non-decoded filename is appended, and the flag
remains set; and decode_utf8 sees the flag and does *nothing*. The result
is that the filename is not decoded, so looks valid and gets skipped.
File::Find only sticks the directory and filenames together in no_chdir
mode .. but we need that mode for security. In order to retain the
security, and avoid the problem, I made it not pass srcdir to File::Find.
Instead, chdir to the srcdir, and pass ".". Since "." is ascii, the problem
is avoided.
Note that chdir srcdir is safe because we check for symlinks in the srcdir
path.
Note that it takes care to chdir back to the starting location,
because the user may have specified relative paths, and staying in
the srcdir would break them. A relative path could even be specified
for an underlay dir, so it chdirs back after each find run.
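A standalone sketch of the approach, using only core modules; the
real ikiwiki code differs, but the shape is the same:

    use File::Find;
    use Cwd;
    use Encode;

    my $srcdir = shift;    # may contain utf8, or be utf8-flagged by YAML
    my @files;

    my $origdir = getcwd();    # remember where we started
    chdir($srcdir) || die "chdir $srcdir: $!";
    find({
        no_chdir => 1,    # needed for the symlink/security checks
        wanted   => sub {
            # $File::Find::name is built from ".", so it stays pure
            # ascii until the (possibly utf8) leaf is pasted on, and
            # decode_utf8 then decodes the whole thing reliably.
            push @files, decode_utf8($File::Find::name);
        },
    }, '.');
    chdir($origdir) || die "chdir $origdir: $!";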
Removing a plugin from add_plugins is not always enough to disable it.
It may have been redundantly added there and also pulled in via goodstuff.
Always add disabled plugins to disable_plugins.
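A sketch of the idea, using the real disable_plugins setup option;
the surrounding websetup code is omitted:

    # Dropping $plugin from add_plugins is not enough (goodstuff may
    # still pull it in), so record the disabling explicitly:
    push @{$config{disable_plugins}}, $plugin
        unless grep { $_ eq $plugin } @{$config{disable_plugins}};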
* calendar: Shorten day names, and improve styling of month calendar.
* style.css: Reduced sidebar width back to 20ex from 30; the month calendar
will now fit in the smaller width, and 30 was feeling too large.
I've seen user(http://*) confuse someone who didn't know pagespecs
into thinking that just http://* would moderate all comments to every
page, or something like that.
The problem is that by the time rendering calls render_dependent,
%pagesources has had deleted files removed from it. So match_comment's
lookup of files in there, to see if they had the _comment extension,
failed.
I had to introduce a hash that temporarily holds filenames of deleted pages
to fix this.
Note that unlike comment(), internal() had avoided this pitfall by being
defined to match both internal and non-internal pages.
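A hedged sketch; %deleted_pagesources is a made-up name for the
temporary hash, and the real match_comment returns Success/FailReason
objects rather than a plain boolean:

    our (%pagesources, %deleted_pagesources, @deleted);

    # In render, before deleted files are dropped from %pagesources:
    $deleted_pagesources{$_} = $pagesources{$_} foreach @deleted;

    # match_comment can then still see the extension of a deleted page:
    sub has_comment_source {
        my $page = shift;
        my $source = exists $pagesources{$page}
            ? $pagesources{$page}
            : $deleted_pagesources{$page};
        return defined $source && $source =~ /\._comment$/;
    }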
If the site is configured to allow comments on *, then the comment post
interface was being added to cgi pages like signin and prefs. This fixes it
w/o requiring more page.tmpl changes. The pagetemplate hook is called by
misctemplate with an empty page name for dynamic pages.
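A sketch of the resulting check in the comments plugin's pagetemplate
hook; the hook name is real, the body abbreviated:

    sub pagetemplate (@) {
        my %params = @_;
        # misctemplate passes an empty page name for dynamic CGI pages
        # (signin, prefs, ...), so skip the comment interface there.
        return unless length $params{page};
        # ... fill in the comment-related template variables as before ...
    }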
On second thought, misctemplate can use pagetemplate hooks to provide
it, so it's better to keep back-compat, and allow full customisation
of how it's displayed via the template.
So that RecentChanges shows up in the action bar there, convert
recentchanges to use the new pageactions hook, with compatibility
code to avoid breaking old templates.
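A sketch of the new hook in ikiwiki's usual registration style; urlto
is a real helper, and "recentchanges" the customary page name, though
the plugin's exact code may differ:

    hook(type => "pageactions", id => "recentchanges",
        call => \&pageactions);

    sub pageactions {
        my %params = @_;
        # Return a list of HTML fragments for the action bar.
        return qq{<a href="}.urlto("recentchanges", $params{page}).
            qq{">RecentChanges</a>};
    }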
If po is imported twice, bad things happen. Guard against that.
I'm not sure what causes the double import; I saw it when websetup did a
wiki rebuild. Carp failed to show a backtrace for the second call to
import.
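A sketch of such a guard; the flag name is made up:

    my $imported = 0;
    sub import {
        return if $imported;    # importing po twice breaks things
        $imported = 1;
        # ... register hooks as usual ...
    }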
* openid: Incorporated a fancy openid-selector signin form.
(http://code.google.com/p/openid-selector/)
* openid: Use "openid_identifier" as the form field, as required
by OpenID Authentication v2.0 spec.
Instead, add a custom do=commentsignin, that calls cgi_signin.
This allows a plugin to inject a custom cgi_signin, that uses a different
do= parameter, and have it be used consistently. (This was the only
place that hardcoded a link to do=signin.)
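A sketch of the dispatch; cgi_signin is named above, and the shape
follows ikiwiki's CGI handling:

    if ($cgi->param("do") eq "commentsignin") {
        IkiWiki::cgi_signin($cgi, $session);
        exit;
    }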
Test isinternal first, because match_glob with internal => 1 also
returns non-internal pages that match. This order should also be
faster.
Remove the test to see if pagesources is set; isinternal will not
succeed if it is not.
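A sketch of the reordered check; isinternal, match_glob and
FailReason are real IkiWiki internals, the rest is illustrative:

    sub match_comment ($$;@) {
        my $page = shift;
        my $glob = shift;

        # isinternal first: it already covers the pagesources check,
        # and match_glob with internal => 1 would also match
        # non-internal pages.
        if (! IkiWiki::isinternal($page)) {
            return IkiWiki::FailReason->new("$page is not internal");
        }

        return match_glob($page, $glob, internal => 1, @_);
    }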
Since all forms are wrapped in a template that defines the actual
stylesheets, formbuilder just has to be told to turn on stylesheet mode,
not which file is the stylesheet.
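A sketch with CGI::FormBuilder; stylesheet => 1 merely turns on the
class attributes, and the wrapping template supplies the actual CSS:

    use CGI::FormBuilder;

    my $form = CGI::FormBuilder->new(
        fields     => [qw(name email)],    # illustrative fields
        stylesheet => 1,    # enable CSS classes; no file named here
    );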