There are two sub-cases. If both source files still exist, which one wins
and renders the destination file is undefined. If one source file is deleted
and the other added, then on a refresh the new file will take over the
destination file.
Using named parameters for these is overdue. Passing the session in a
parameter, instead of passing the username and IP separately, will later
allow storing other session info, like username or part of the email.
Note that these functions are not part of the exported API,
and the prototype change will catch (most) skew, so I am not changing
API versions. Any third-party plugins that call them will need to be
updated, though.
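Roughly, the new calling convention looks like this (do_commit_demo is a
made-up stand-in, not one of the real internal functions; the session
accessors shown are the usual CGI::Session ones):

    # old style (positional):
    #   do_commit($file, $message, $rcstoken, $user, $ipaddr);
    # new style (named), with the session object carrying user and IP:
    sub do_commit_demo {
        my %params = @_;
        my $user = $params{session}->param('name');    # was a separate $user argument
        my $ip   = $params{session}->remote_addr();    # was a separate $ipaddr argument
        return "commit of $params{file} by $user ($ip): $params{message}";
    }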
* openid: Incorporated a fancy openid-selector signin form.
(http://code.google.com/p/openid-selector/)
* openid: Use "openid_identifier" as the form field, as required
by OpenID Authentication v2.0 spec.
Many callers were incorrectly passing file_prune 2 parameters. In cases
where the filename being checked is relative to the srcdir, the second
parameter is not needed.
Also made absolute filenames be pruned. (This won't work with the
2-parameter call style.)
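A self-contained sketch of the distinction (file_prune_demo only
approximates the real behaviour, and the prune regexp is just a sample);
note how stripping the base in the 2-parameter style defeats the
absolute-filename check:

    sub file_prune_demo {
        my ($file, $base) = @_;
        # old 2-parameter style: strip the srcdir base first, which defeats
        # the absolute-filename check below
        $file =~ s/^\Q$base\E\/?// if defined $base;
        # new: absolute filenames are pruned outright
        return 1 if $file =~ m{^/};
        return $file =~ m{(^|/)\.(git|svn|hg)(/|$)} ? 1 : 0;    # sample prune regexp
    }

    print file_prune_demo(".git/config"), "\n";    # 1: pruned
    print file_prune_demo("/etc/passwd"), "\n";    # 1: absolute, pruned
    print file_prune_demo("index.mdwn"), "\n";     # 0: kept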
This can be a lot faster, since huge numbers of pages are not sorted
only to mostly be thrown away. It sped up a build of my blog by at least
5 minutes.
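The shape of the win, in a trivial standalone form (the page set, match
test, sort key, and limit are all stand-ins): match first, then sort only
the survivors.

    my @allpages = map { { name => "page$_", mtime => $_ } } 1 .. 50_000;
    # cheap filter first, so the sort never sees pages that get thrown away
    my @matching = grep { $_->{name} =~ /7$/ } @allpages;
    my @shown    = (sort { $b->{mtime} <=> $a->{mtime} } @matching)[0 .. 9];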
The hash will be used to record a set of pages that influenced the
result of a pagespec match.
The influences are merged together when boolean and/or are encountered
in a pagespec. That requires a non-short-circuiting OR operator, so I use
& and | when translating pagespecs, since those bitwise operators can be
overloaded ("and" and "or" apparently cannot).
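A minimal, self-contained illustration of why the overloadable bitwise
operators help (DemoResult is not the real class, just its shape): with
& and | both operands are always evaluated, so influences from both
sides can be merged.

    package DemoResult;
    use overload
        'bool' => sub { $_[0]->{ok} },
        '&' => sub { DemoResult->new($_[0]->{ok} && $_[1]->{ok},
                         %{$_[0]->{influences}}, %{$_[1]->{influences}}) },
        '|' => sub { DemoResult->new($_[0]->{ok} || $_[1]->{ok},
                         %{$_[0]->{influences}}, %{$_[1]->{influences}}) };
    sub new {
        my ($class, $ok, %influences) = @_;
        return bless { ok => $ok, influences => \%influences }, $class;
    }

    package main;
    my $first  = DemoResult->new(0, "tags/foo" => 1);
    my $second = DemoResult->new(1, "blog/bar" => 1);
    my $both   = $first | $second;    # true, and influenced by both pages
    print $both ? "matched" : "failed",
          ", influences: ", join(", ", sort keys %{$both->{influences}}), "\n";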
calls are warranted. They shouldn't modify the caller's working directory,
though. Use File::chdir to keep the scope of the changes subroutine-local.
The tests now pass without resetting the working directory.
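The idiom, roughly (run_in_srcdir_demo is just an example wrapper; $CWD is
File::chdir's documented interface):

    use File::chdir;

    sub run_in_srcdir_demo {
        my ($srcdir, @cmd) = @_;
        local $CWD = $srcdir;    # chdir for the rest of this sub only
        return system(@cmd);
    }    # caller's working directory is restored automatically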
Now that dependencies are a list of pagespecs with an implicit "or"
operation, there's no need to try to merge pagespecs under normal use.
ikiwiki-transition contains the only use of the function, so move
it there rather than deleting it entirely (it's used to concatenate all
admins' lists of locked pages).
On a large wiki you can spend a lot of time reading through large lists
of dependencies to see whether files need to be rebuilt (album, with its
one-page-per-photo arrangement, suffers particularly badly from this).
The dependency list is currently a single pagespec, but it's not used like
a normal pagespec - in practice, it's a list of pagespecs joined with the
"or" operator.
Accordingly, change it to be stored as a list of pagespecs. On a wiki
with many tagged photo albums, this reduces the time to refresh after
`touch tags/*.mdwn` from about 31 to 25 seconds.
Getting the benefit of this change on an existing wiki requires a rebuild.
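Illustrative only (pagespec matching is reduced to a trivial glob check
here); the point is the per-page list of pagespecs with an implicit "or":

    my %depends;    # page => [ pagespec, pagespec, ... ]

    sub add_depends_demo {
        my ($page, $pagespec) = @_;
        push @{ $depends{$page} }, $pagespec;
    }

    sub depends_on_demo {
        my ($page, $changed) = @_;
        foreach my $spec (@{ $depends{$page} }) {
            (my $re = $spec) =~ s/\*/.*/g;
            return 1 if $changed =~ /^$re$/;    # short-circuit on first match
        }
        return 0;
    }

    add_depends_demo("index", "blog/*");
    add_depends_demo("index", "tags/photo");
    print depends_on_demo("index", "blog/hello"), "\n";    # 1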
The test suite was emitting a lot of ugly gettext warnings;
setting LC_ALL didn't solve the problem for all locale setups
(since ikiwiki remaps it to LANG, and ikiwiki didn't know about
the C locale).
People also seem generally annoyed by the messages when
Locale::gettext is not installed, and I suspect they will be
generally happier if it just silently doesn't localize.
The optimisation came about when I noticed that the gettext
sub was doing rather a lot of work each call just to see
if localisation is needed. We can avoid that work by caching,
and the best thing to cache is a version of the gettext sub
that does exactly the right thing.
This was slightly complicated by the locale setting,
which might need to override the original locale (or lack
thereof) after gettext has already been called. So setting the locale
needs to invalidate the cache in that case. That used to be done via a
global variable, which I am happy to have also gotten rid of.
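A standalone sketch of the caching (not the real IkiWiki.pm code; it
assumes Locale::gettext's OO interface when that module is available):

    my $gettext_sub;

    sub gettext_demo {
        $gettext_sub ||= do {
            if (($ENV{LANG} || $ENV{LC_ALL}) &&
                eval { require Locale::gettext; 1 }) {
                my $d = Locale::gettext->domain('demo');
                sub { $d->get(shift) };    # localising version
            }
            else {
                sub { shift };             # pass-through version
            }
        };
        return $gettext_sub->(shift);
    }

    sub set_locale_demo {
        $ENV{LANG} = shift;
        undef $gettext_sub;    # invalidate, so the next call rebuilds the sub
    }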
A directive that contains an unterminated """ string should not
cause each word of the string to be treated as a bare word. Instead,
the directive should fail to parse.
There are two tests. One just checks that a complete directive
containing such a string fails to parse. The other checks for a case
where a directive ends with a very long unterminated """ string,
and the directive is itself not closed. While this test won't fail,
it does trigger a nasty perl warning.
See bug #411786. Perl's random corruption of the taint flag is even affecting
the untainting of source filenames now (which, AFAICS, is a proper untaint
and always worked before), and that makes using ikiwiki in perl taint
mode not work at all.
Also, avoid a whole class of potential security problems (though
none that I know of actually exist) by not performing any string
interpolation on user-supplied data when translating pagespecs.
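The idea, roughly (this sketch may not match the real implementation
detail for detail): wrap each user-supplied word in a q{} literal,
stripping braces first, before embedding it in the generated perl code.

    sub safequote_demo {
        my $s = shift;
        $s =~ s/[{}]//g;    # braces would terminate the q{} early
        return "q{$s}";
    }

    # a pagespec word like  blog/$user`date`  ends up in the generated code
    # as the literal string  q{blog/$user`date`}  and is never interpolated
    print safequote_demo('blog/$user`date`'), "\n";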
This works around an enormous (and, in this context, enormously confusing)
message that git has begun to print when one attempts to push changes into
a non-bare repo.
As a bonus, it now tests whether ikiwiki-makerepo works.
This may already work with other web servers that have copied apache's
interface, and it should be easy to add support for web servers that
use some other interface. So, make the name more general.
It seems to be a failing of i18n in unix that translation stops at the
commands and the parameters to them, and ikiwiki is no exception, with its
currently untranslated directives. So the little bit that's translated sticks
out like a sore thumb. It also breaks building of wikis if a different locale
happens to be set.
I suppose the best thing to do is either give up on localising this part
completely, or make it recognise English in addition to the locale. I've
tentatively chosen the latter.
(Also accept 1 and 0 as input.)
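For a yes/no style parameter, "English in addition to the locale" amounts
to something like the sketch below (the gettext wrapper is passed in only
to keep the example self-contained; this is not the real helper):

    sub yesno_demo {
        my ($val, $gettext) = @_;
        return 1 if lc($val) eq 'yes'
                 || lc($val) eq lc($gettext->('yes'))
                 || $val eq '1';
        return 0;
    }

    my $de = sub { $_[0] eq 'yes' ? 'ja' : $_[0] };    # toy "translation"
    print yesno_demo('ja',  $de), "\n";    # 1: the locale's word
    print yesno_demo('yes', $de), "\n";    # 1: English still accepted
    print yesno_demo('0',   $de), "\n";    # 0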
This fixes a problem exposed by the recent change to tags
(a2839de936). That recorded tag links as
absolute by including a leading slash in the link. The same could also be
done with an absolute wikilink.
In either case, link() would not match such links, unless the leading slash
was included in the link to match. But that's not right, because pagespecs
match absolute by default. So strip the leading slash.
Note that to keep any existing `link(/foo)` pagespecs working after this
change, the leading slash is removed from there, too.
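Roughly what the normalisation amounts to (illustrative only; the real
change touches both link recording and link() matching):

    sub normalise_link_demo {
        my $link = shift;
        $link =~ s/^\///;    # pagespecs match absolute by default anyway
        return $link;
    }

    print normalise_link_demo("/sandbox"), "\n";    # "sandbox"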
I want an easy way to know if I break something when I convert custom-added
hooks to the new "inject" feature. It will also be useful after this
conversion, to trigger an alert when IkiWiki's internals change enough to
break my wrapper functions.
Signed-off-by: intrigeri <intrigeri@boum.org>
Have to convert link text to page name going in.
And on the way out, need to replace spaces with underscores in the link
text, which is not normally done with titles.
* Renamed every single variable or function called pedigree to
parentlinks
* Removed the parentlinks function from Render.pm
* Enabled the new parentlinks plugin by default
* Adapted testsuite and documentation to reflect the above facts
Signed-off-by: intrigeri <intrigeri@boum.org>
Because the search plugin needed it, and also because it's one of the few
plugins that didn't already have it.
I also considered adding it to htmlize, but I really cannot imagine caring
what the destpage is when htmlizing. (I'll probably be proven wrong later.)
This manifested as wikis with no locked pages treating them all as locked.
The bug was introduced in version 2.41.
Medium urgency upload due to above fix.
I kept it to a simple global configuration, rather than using the
preprocessor directive for recentchanges, because that had chicken and egg
problems and seemed overcomplicated. This should work reasonably well,
though it would be good to add some more metadata so that more customised
recentchanges pages can be made.
Add a prefix_directives option to the setup file to turn this syntax
on; currently defaults to false, for backward compatibility. Support
optional '!' prefix even with prefix_directives off, and use that in
the underlay to support either setting of prefix_directives. Add NEWS
entry with migration information.
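For example, the relevant line in an otherwise unchanged setup file would
look something like this (the comment values are illustrative):

    # inside the IkiWiki::Setup::Standard options hash
    prefix_directives => 1,    # use the new [[!directive ...]] syntax;
                               # 0 (the current default) keeps the old one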
* Plugins can add new directories to the search path with the add_underlay
function.
* Split out smiley underlay files into a separate underlay, so if the plugin
isn't used, the wiki isn't bloated with all those files.
* Support building on systems that lack asprintf.
* mercurial getctime is currently broken, apparently by some change in
mercurial version 0.9.4. Turn the failing test case into a TODO test case.
(Get a good message when a PageSpec fails due to a negated success by
creating success objects with a reason string, which morph into failure
objects when negated.)
scalar context, evaluates to a reason why the match failed.
* Add testpagespec plugin, which might be useful to see why a pagespec isn't
matching something.
for extended pagespecs. The old calling convention will still work for
back-compat for now.
* The calling convention for functions in the IkiWiki::PageSpec namespace
has changed so they are passed named parameters.
* Plugin interface version increased to 2.00 since I don't anticipate any
more interface changes before 2.0.
that given link points based on the page doing the linking. Note that this
could make such PageSpecs match different things than before, if you
relied on the old behavior of them only matching the raw link text.
* This required changing the match_* interface, adding a third parameter.
* Allow link() PageSpecs to match relative, as is allowed with globs.
* Add postform option to inline plugin.
* Add a bug tracker to the softwaresite example.
parameters remain the same, but additional options are now passed in using
named parameters.
* Change plugin interface version to 1.02 to reflect this change.
* Add a new anchor option to htmllink. Thanks Ben for the idea.
* Support anchors in wikilinks.
* Add a "more" plugin based on one contributed by Ben to allow implementing
those dreaded "Read more" links in blogs.
* Add a "conditional" plugin, which allows displaying text if a condition
is true. It is enabled by default so conditional can be used in the
basewiki.
* Use conditionals in the template for plugins, so that plugin pages
say if they're currently enabled or not, and in various other places
in the wiki.
* Add a test suite for the svn backend.
* Daemonize before sending RPC pings, since that can take a while
and/or hang.
* Daemonize before sending commit mails, as that can also take a long
time/hang if the mail server is unhappy.
* Factor out commit mail sending code into new function.
* Add some code to the build system that tries to determine if the
lib installation directory is in @INC. If it's not, munge ikiwiki
to hardcode the path to the lib directory. This should allow installing
ikiwiki in nonstandard locations, including home directories, by just
setting PREFIX at build time.
* Fix nested examples directory in deb.
source file, to allow tracking of extra rendered files like rss feeds.
* Note that plugins that accessed this variable will need to be updated!
The plugin interface has been increased to version 1.01 for this change.
* Add will_render function to the plugin interface, used to register that a
page renders a destination file, and do some security checks.
* Use will_render in the inline and linkmap plugins.
* Previously but no longer rendered files will be cleaned up.
* You will need to rebuild your wiki on upgrade to this version.
- Plugins should not need to load IkiWiki::Render to get commonly
used functions, so moved some functions from there to IkiWiki.
- Picked out the set of functions and variables that most plugins
use, documented them, and made IkiWiki export them by default,
like a proper perl module should.
- Use the other functions at your own risk.
- This is not quite complete; I still have to decide whether to
export some other things.
* Changed all plugins included in ikiwiki to not use "IkiWiki::" when
referring to stuff now exported by the IkiWiki module.
* Anyone with a third-party ikiwiki plugin is strongly encouraged
to make similar changes to it and avoid use of non-exported symbols from
"IkiWiki::".
* Link debian/changelog and debian/news to NEWS and CHANGELOG.
* Support hyperestraier version 1.4.2, which adds a new required phraseform
setting.
* Add --version.
* Man page format fixups.
* Add a %pagecase which maps lower-case page names to the actual case
used in the filename. Use this in bestlinks calculation instead of
forcing the link to lowercase.
* Also use %pagecase in various other places that want to check if a page
with a given name exists.
* This means that links to pages with mixed case names will now work,
even if the link is in some other case mixture, and mixed case pages
should be fully supported throughout ikiwiki.
* Recommend rebuilding wikis on upgrade to this version.
* PageSpecs can now include nested parens, "and", and "or". This remains
backwards compatible with the old GlobList format. It's implemented by
treating the GlobList as a very limited microlanguage that is transformed
to perl code that does the matching (a rough sketch of the translation
appears just below).
* The old GlobList format is deprecated, and I encourage users to switch to
using the new PageSpec format. Compatibility with the old format will be
removed at some point, possibly by 2.0.
* Wiki rebuild needed on upgrade to this version due to PageSpec change.
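The promised sketch of the translation, pared down to bare globs, "and",
"or", and "!" (the real pagespec_translate handles more, including parens
and proper quoting of user input):

    sub translate_demo {
        my $spec = shift;
        my $code = '';
        foreach my $word (split ' ', $spec) {
            if ($word =~ /^(and|or)$/) {
                $code .= " $word";
            }
            else {
                my $neg = ($word =~ s/^!//) ? '! ' : '';
                $code .= " $neg match_demo(\$page, q{$word})";
            }
        }
        return $code;
    }

    sub match_demo {
        my ($page, $glob) = @_;
        $glob =~ s/\*/.*/g;
        return $page =~ /^$glob$/;
    }

    my $page = "blog/hello";
    my $code = translate_demo("blog/* and !blog/private/*");
    print eval($code) ? "match\n" : "no match\n";    # match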
* Add support for creation_month and creation_year to PageSpec.
Closes: #380680
* Changes to index file encoding.
flagged string even if the locale causes it to generate utf8 output,
so make sure to let perl know it should be handled as utf8. Also,
the optimised version used for standard time formats won't work if the
user has changed locale, so drop it. Thanks, Faidon Liambotis.
* Fix re-encoding of the comments field to utf8 if a commit fails
due to a conflict. Thanks, Faidon Liambotis.
* Let svn know that commits have utf8 commit messages. Thanks, Faidon
Liambotis.
* Add insane double encode/decode to utf8 around call to markdown.
This works around a truly strange bug, which is apparently a bug in
perl, which I lack space to describe here (see t/crazy-badass-perl-bug.t)
layer, which led to lots of problems; make it force read files as utf-8.
Closes: #373203
* writefile() likewise needs to use the utf8 output layer.
* Remove the -CSD from ikiwiki's hashbang since it's useless to have it
there.
* Revert some of the decode_utf8 changes in CGI.pm that seem unnecessary
given the readfile fix.
* Add utf-8 testcases for readfile and htmlize.
* When inlining a page in another one, links from the inlined page are now
expanded the same as they are when rendering the inlined page as a
standalone page. So rather than being expanded from the POV of the
inlining page, they are expanded from the POV of the inlined page.
For example, a link from blog/foo to "bar" will now link to blog/bar
if it exists. Previously this needed to be a link explicitly to
"blog/bar"; such links will also continue to work.
(This was slightly complex to do as the link still has to be constructed
relative to the inlining page.)
* Add an html validity check to the test suite, using the wdg-html-validator,
if available.
* Make the html valid when there is nothing in the actions list by adding an
empty <li> to the end of it.
* Reordered some function call parameters for consistency.