Implemented for git and svn so far.
Note that rcs_commit_staged does assume that the rcs has the ability to
"stage" multiple changes for a later commit. Support for this varies, but
all we really care about is staging removals and renames, which, AFAIK, all
modern rcs's support.
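As a rough illustration of the git case (a sketch only, not the exact
ikiwiki code; the argument order and the author string are assumptions):

    sub rcs_commit_staged {
        my ($message, $user, $ipaddr) = @_;
        # Everything to commit (removals, renames) is already staged in
        # git's index, so just commit it, attributing it to the web user.
        if (system('git', 'commit', '-q', '-m', $message,
                   '--author', "$user <$user\@web>") != 0) {
            return "git commit failed";   # treated as an error message
        }
        return undef;                     # undef indicates success
    }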
can be used to avoid a security check that is a good safe default, but
problematic overkill in some situations.
I decided to underdocument this, because the option looks ugly, and I don't
want people randomly turning it on because it looks like a good idea. So if
you need it, you'll get an error message mentioning how to fix it.
The committer's email address is not used (because leaking email addresses
is not liked by many users). Closes: #451023
A "Web-commit" trailer is added, to allow telling the difference between
web commits and direct commits.
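For illustration, a web commit's message might then end up looking
something like this (the trailer value shown is an assumption; only the
trailer name comes from the change itself):

    Fix a typo on the sandbox page

    Web-commit: true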
Smileys need to be double-escaped to work, since the smiley plugin runs as
a sanitize hook, and markdown helpfully removes one level of escapes first.
There were some bugs in the smiley handling code that caused escaped smileys
to still be expanded. After unescaping a smiley, the code needed to move pos
forward past it, or the next pass would expand it.
Also, once the m//g got to the end, it seemed to loop back through and make
one more pass (a difference in perl 5.10's regexp engine?). I observed that
pos was undefined when this happened, so added a `last unless defined pos`
guard.
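A rough sketch of the resulting loop shape, using a single hard-coded
smiley instead of the plugin's real smiley table and regexp:

    my $content = 'escaped \\:-) and plain :-)';
    while ($content =~ m/(\\?)(:-\))/g) {
        if ($1) {
            substr($content, $-[1], 1) = '';   # strip one level of escaping
            pos($content) = $+[2] - 1;         # skip past the smiley itself
        }
        else {
            substr($content, $-[2], $+[2] - $-[2]) = '[smiley image]';
            pos($content) = $-[2] + length '[smiley image]';
        }
        last unless defined pos($content);     # guard against the extra pass
    }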
This is truly horribly disgusting. CGI::tmpFileName, in current perls, is
an undocumented function (which should be a clue..) that takes the original
filename of an uploaded attachment, and returns the name of the tempfile
that CGI has stored it in.
In old perls, though, CGI::tmpFileName does not take a filename. It takes
a key from the object's {'.tmpfiles'} hash. This key is something
crazy like '*Fh::fh00001group' -- apparently the stringification of a
filehandle object.
Just to add to the fun, tmpFileName doesn't take the key; it expects a
reference to the key. Argh?!
But the fun doesn't stop there, because in perl 5.8, CGI.pm is also broken
in two other ways. The upload() method is supposed to return a filehandle
to the temp file. It doesn't. The param() method is supposed to return
a filehandle to the temp file, one that stringifies to the original filename.
It returns just the original filename, and no filehandle.
Combine all these bugs, and you end up with this disgusting commit. Since
I have no way to get the filehandle, I *need* to get the tempfile name.
If I had the filehandle, I could probably pass it into tmpFileName, and
it might stringify to the right key name. But I don't, so the only way to
determine the key is to grub through the .tmpfiles hash ourselves.
And finally, once the temp file name is discovered, a filehandle can be
obtained by (re)opening it.
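A hedged sketch of the resulting dance, assuming a CGI object $q and the
original upload field value in $filename; this is illustrative, not the
exact ikiwiki code:

    my $tmpfile = $q->tmpFileName($filename);    # works on current perls
    if (! defined $tmpfile || ! length $tmpfile) {
        # perl 5.8: tmpFileName wants a reference to a key of the
        # undocumented .tmpfiles hash, so dig the key out ourselves.
        foreach my $key (keys %{$q->{'.tmpfiles'}}) {
            $tmpfile = $q->tmpFileName(\$key);
            last if defined $tmpfile && length $tmpfile;
        }
    }
    # Once the name is known, (re)open it to finally get a filehandle.
    open(my $fh, '<', $tmpfile) or die "failed to open $tmpfile: $!";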
I recommend that this commit be reverted when perl 5.8 is a mercifully
faded memory.
I'm really, really, really glad I'm actually being paid for working on
this right now!
So the problem is that ikiwiki would generate a relative link like
href="colon:problem", which web browsers treat as being in the "colon:"
uri scheme.
The best fix seems to be to make url beautification fix this, by slapping
a "./" in front.
* The editpage form now uses the raw page name, not the page title, in its
'page' cgi parameter. Using the title was ambiguous and made it
impossible to distinguish between some pages, like "foo/bar" and "foo__47__bar",
sometimes causing the wrong page to be edited.
* This change means that some edit links need to be updated.
Force a rebuild on upgrade to this version.
* Above change also allowed really fixing escaped slashes from the blogpost
form.
* toc: Revert change in 2.45 that made it run at sanitize time. This breaks
use of toc in a sidebar.
* Call format hooks when generating page previews, thus fixing toc display
there, as well as fixing inlines to again display in page previews, since
inline has started using format hooks. This also allows several other things,
like embed, that use format hooks, to work during page preview time.
* Format hooks should not rely on getting an entire html document, as they
will only get the body during page preview (see the sketch after this list).
* toggle: Deal with preview mode when adding javascript.
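For reference, a preview-safe format hook registration might look roughly
like this (the hook id and the appended comment are made up; the hook()
call and the page/content parameters follow ikiwiki's usual convention):

    hook(type => "format", id => "example", call => sub {
        my %params = @_;
        my $content = $params{content};
        # In a full build $content is a whole html document; in a page
        # preview it is only the body, so don't count on <head>/<body>.
        if ($content =~ m{</body>}i) {
            $content =~ s{</body>}{<!-- appended by example -->\n</body>}i;
        }
        else {
            $content .= "\n<!-- appended by example -->";
        }
        return $content;
    });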
This special case crops up when generating the parentlink to the toplevel
index page. urlto("") had been generating a link to "./" (or "../" etc)
for that, which is fine, if the web server redirects that to the toplevel
index.html. It's less fine if there is no web server.
I actually ran into the problem first when using gopher. (Yes, yes, don't
laugh.. see upcoming tip.) But it also crops up when browsing local wiki
files.
Of course, the index.html is stripped back off if usedirs is enabled.
Because the search plugin needed it, and also because it's one of the few
plugins that didn't already have it.
I also considered adding it to htmlize, but I really cannot imagine caring
what the destpage is when htmlizing. (I'll probably be proven wrong later.)
This fixes a problem sgran saw on alioth. Apparently nss-db sets errno to
ENOENT as a side effect trying to read an optional file, but succeeds
anyway. Then, somehow, errno remains set across the library calls made by
$).
So unset it first as a workaround; there's probably a nss-db, libc, and/or
perl bug underneath.
Something has changed in CGI.pm in perl 5.10. It used to not care
if STDIN was opened using :utf8, but now it'll mis-encode utf-8 values
when used that way by ikiwiki. Now I have to binmode(STDIN) before
instantiating the CGI object.
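Concretely, the workaround amounts to this (sketch):

    use CGI;

    binmode(STDIN);     # undo any :utf8 layer before CGI reads the body
    my $q = CGI->new;   # CGI.pm then handles the utf-8 form values sanely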
In 57bba4dac1, I changed from decoding
CGI::Formbuilder fields to utf-8, to decoding cgi parameters before setting
up the form object. As of perl 5.10, that approach no longer has any effect
(reason unknown). To get correctly encoded values in FormBuilder forms,
they must once again be decoded after the form is set up.
As noted in 57bba4da, this can cause one set of problems for
formbuilder_setup hooks if decode_form_utf8 is called before the hooks, and
a different set if it's called after. To avoid both sets of problems, call
it both before and after. (Only remaining problem is the sheer ugliness and
inefficiency of that..)
I think that these changes will also work with older perl versions, but I
haven't checked.
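The resulting shape is roughly this (field names and the hook arguments
are illustrative; decode_form_utf8 and run_hooks are ikiwiki's own):

    my $form = CGI::FormBuilder->new(fields => [qw(editcontent comments)]);
    decode_form_utf8($form);   # so formbuilder_setup hooks see decoded values
    run_hooks(formbuilder_setup => sub { shift->(form => $form) });
    decode_form_utf8($form);   # perl 5.10: re-decode what setup may have set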
Also, in the case of the poll plugin, the cgi parameter needs to be
explicitly decoded before it is used, in order to handle utf-8 values. (This
may have always been broken; not sure if it's related to perl 5.10 or not.)
* Add a Bundle::Ikiwiki to the source for use with CPAN to install *all*
the modules ikiwiki can use.
* Add a cpan directory containing a CPAN::MyConfig that can ease use of
CPAN to install in a home directory on shared hosting providers.
* With these changes, it's pretty easy to install onto nearlyfreespeech.net
and probably other shared hosting providers like dreamhost. Added
a tip page documenting the process for nearlyfreespeech.
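For illustration, the core of such a CPAN::MyConfig just points CPAN's
working directories and install target at the home directory; the key set
shown is abbreviated and the paths are examples, the cpan directory shipped
in the source is the authoritative version:

    # CPAN/MyConfig.pm, somewhere on PERL5LIB
    $CPAN::Config = {
        cpan_home         => "$ENV{HOME}/.cpan",
        build_dir         => "$ENV{HOME}/.cpan/build",
        keep_source_where => "$ENV{HOME}/.cpan/sources",
        makepl_arg        => "INSTALL_BASE=$ENV{HOME}",
        mbuildpl_arg      => "--install_base $ENV{HOME}",
    };
    1;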
This manifested as wikis with no locked pages treating them all as locked.
The bug was introduced in version 2.41.
Medium urgency upload due to above fix.
The fix involved embedding the session id in the forms, and not allowing the
forms to be submitted if the embedded id does not match the session id.
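A hedged sketch of that check, with illustrative field and message names,
assuming a CGI::FormBuilder $form, a CGI $q, and a CGI::Session $session:

    # When building the form, embed the current session id.
    $form->field(name => 'sid', type => 'hidden',
                 value => $session->id, force => 1);

    # When handling a submission, refuse to act unless it matches.
    if ($form->submitted) {
        my $sid = $q->param('sid');
        if (! defined $sid || $sid ne $session->id) {
            die "session expired or cross-site request; not submitting form";
        }
    }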
In the case of the preferences form, if the session id is not embedded,
then the CGI parameters are cleared. This avoids a secondary attack where the
link to the preferences form prefills password or other fields, and
the user hits "submit" without noticing these prefilled values.
In the case of the editpage form, the anonok plugin can allow anyone to edit,
and so I chose not to guard against CSRF attacks against users who are not
logged in. Otherwise, it also embeds the session id and checks it.
For page editing, I assume that the user will notice if the content or commit
message is changed because of CGI parameters, and won't blindly hit save page.
So I didn't block those CGI parameters. (It's even possible to use those CGI
parameters, for good, not for evil, I guess..)
The only other CSRF attack I can think of in ikiwiki involves the poll plugin.
It's certainly possible to set up a link that causes the user to unknowingly
vote in a poll. However, the poll plugin is not intended to be used for things
that people would want to attack, since anyone can after all edit the poll page
and fill in any values they like. So this "attack" is ignorable.
During refresh of a wiki with 800 files, loadindex was using more total
time than any other function, and saveindex was also in the top ten.
Rewriting them to use Storable makes them three times as fast.
0.7 seconds is saved on my laptop in profiling mode.
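A minimal sketch of the idea; the filename and the single %index hash are
illustrative, the real index holds several kinds of per-page state:

    use Storable qw(lock_nstore lock_retrieve);

    our %config;   # ikiwiki's config hash, for wikistatedir
    our %index;    # stand-in for the per-page state being saved

    sub saveindex {
        lock_nstore(\%index, "$config{wikistatedir}/indexdb");
    }

    sub loadindex {
        %index = %{ lock_retrieve("$config{wikistatedir}/indexdb") };
    }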
About 12% of ikiwiki runtime was spent in pagespec_match. It was evaling
the same pagespec code over and over again. This changes pagespec_translate
to return memoized, precompiled functions that can be called to match pages
against a given pagespec.
This also allows getting rid of the weird variable scoping trick that had
to be in effect for pagespec_translate to be called -- the variables are
now just fed into the function it returns.
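The shape of the change, roughly (translate_to_code is a stand-in for the
old body of pagespec_translate, which produced perl source for a pagespec):

    my %compiled;

    sub pagespec_translate {
        my $spec = shift;
        # Compile each distinct pagespec once, into a closure that takes
        # the page (and any named parameters) as arguments.
        return $compiled{$spec} ||= eval
            'sub { my $page = shift; my %params = @_; '
            . translate_to_code($spec) . ' }';
    }

    # Callers then do, in effect:
    #   pagespec_translate($spec)->($page, location => $from);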
On my laptop, this drops build time for the docwiki from about 60 to 50
seconds.
Markdown is slow. Especially if it has to process an enormous page. The
most common enormous page is currently the recentchanges page, which gets
processed a lot, and contains very little actual markdown. Most of it is a
big <div>, which markdown skips ... slowly.
This is a rather sick optimisation to work around markdown's speed issues.
Now inline inserts a small, dummy div, allows markdown to quickly render
the actual page content, then replaces the dummy with the actual inlined
pages later.
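In outline the trick looks something like this (the placeholder format and
names are illustrative, not inline's exact code):

    my @inlined;

    # At preprocess time, stash the expensive html and hand markdown a stub.
    sub placeholder_for {
        my $html = shift;
        push @inlined, $html;
        return '<div class="inline" id="'.$#inlined.'"></div>';
    }

    # In a format hook, after markdown has run, swap the real content back.
    sub replace_placeholders {
        my $content = shift;
        $content =~ s{<div class="inline" id="(\d+)"></div>}{$inlined[$1]}g;
        return $content;
    }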
Results: Rendering just a recentchanges page, with diffs included, dropped
from 4.5 seconds to 2.7 seconds on my laptop. Building the entire wiki
dropped from 46.6 seconds to 39.5 seconds.
(It would be better if inline were a *post*-processor directive.)
set to the destination page. This avoids the need for hacks to munge the urls
in preview mode, which fixes several bugs.
* Several destpage fixes in plugins.
correct, and it's certainly not correct now, since the wiki is locked
before rcs_commit is ever called, and should not be unlocked by
rcs_commit either.
Markdown is such a splintered mess.. The current debian package provides
only Text::Markdown::Markdown, while all versions of Text::Markdown support
Text::Markdown::markdown, and old versions also support the capitalised version,
while new ones don't.
It's getting to the point where `grep /markdown/i %symbol_table` is the only
sane way to figure out what function to call..
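In practice the mdwn plugin ends up doing a dance along these lines
(a sketch, not the exact code):

    my $markdown_sub;
    eval q{use Text::Markdown};
    if (! $@) {
        if (Text::Markdown->can('markdown')) {
            $markdown_sub = \&Text::Markdown::markdown;
        }
        else {
            $markdown_sub = \&Text::Markdown::Markdown;
        }
    }
    else {
        eval q{use Markdown};   # fall back to the original Markdown.pm
        $markdown_sub = \&Markdown::Markdown unless $@;
    }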
* rcs_diff is a new function that rcs modules should implement.
* Implemented rcs_diff for git, svn, and tla (tla version untested).
Mercurial and monotone still todo.
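The intended interface is small; for git it might look roughly like this
(a sketch only, and what the argument identifies depends on the rcs):

    sub rcs_diff {
        my $rev = shift;              # e.g. a git sha1 or svn revision
        my @lines = `git show $rev`;  # the diff for that change
        return wantarray ? @lines : join('', @lines);
    }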
Add special handling for <meta name="robots" ...>, which need not be
scrubbed as it's harmless.
Signed-off-by: martin f. krafft <madduck@madduck.net>
(cherry picked from commit b15d0299a7f7b147e89d8a202d6cca1c21491af2)
As was already done for linkification, links generated in a preview page
are relative to the top of the wiki, so it has to be told that the destpage
is there.
I was using "" to indicate this, but that may confuse some preprocessor
plugins, which treat parameters with an empty value specially (sparkline is one
such). Instead, use "/", which is more accurate anyway and works just as well.
which forced a scan of the page to make available metadata that
appeared after the inline directive. The problem is that the scan made it forget
about any other files rendered due to the page. The scan also turns out
to be unnecessary now, since meta persistently stores state and it's
always available. So it was just removed.
(as preserving the full list across preview would be tricky). Userdirs
were still being offered as an option there, so remove them.
* Fix a bug where user A created a page concurrently with user B, and
when B previewed, it would redirect B to A's new page, losing B's work.
Instead, don't redirect and let conflict handling resolve it.
containing ikiwiki.cgi, but this should not change the urls to the style
sheets etc. Add a new forcebareurl parameter to misctemplate to allow
it to do that.
of XML::RPC's default of us-ascii. Allows interoperation with
python's xmlrpc library, which threw invalid encoding exceptions and
caused the rst plugin to hang.
Some browsers interpret about: URIs like a limited version of data:
URIs. In particular, some versions of Internet Explorer interpret
arbitrary HTML content in about: URIs.
just avoid actually writing the files. This is necessary because ikiwiki
saves state after a preview (in case it actually *did* write files),
and if will_render isn't called, its security checks will get upset
when the page is saved. Thanks to Edward Betts for his help tracking this
tricky bug down.
- On commits, replace the bidirectional "mtn sync" with a single-direction
"mtn push". No need to pull changes when doing a commit. mtn sync
is still called in rcs_update.
- Support for viewing differences via patches using viewmtn.
Now aggregation will not lock the wiki. Any changes made during aggregation are
merged in with the changed state accumulated while aggregating. A separate
lock file prevents multiple concurrent aggregators. Garbage collection
of orphaned guids is much improved. loadstate() is only called once
per process, so tricky support for reloading wiki state is not needed.
(Tested fairly thoroughly.)
since this leads to too many problems with web caching, especially with
inlined pages. Properly solving this would involve tracking every page
that contributes to a page's content and using the youngest of them all,
as well as special cases for things like the version plugin, and it's just
too complex to do.
license, and copyright. This can be used to create custom RecentChanges.
* meta: To support the pagespec functions, metadata about pages has to be
retained as pagestate.
* Fix encoding bug when pagestate values contained spaces.
This makes it a lot quicker to deal with lots of recentchanges pages
appearing and disappearing. It avoids needing to clutter up pagespecs with
exclusions for those pages, by making normal pagespecs not match them.
Add a prefix_directives option to the setup file to turn this syntax
on; currently defaults to false, for backward compatibility. Support
optional '!' prefix even with prefix_directives off, and use that in
the underlay to support either setting of prefix_directives. Add NEWS
entry with migration information.
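For example, with

    prefix_directives => 1,

in the setup file, directives are written with the bang prefix, as in
[[!inline pages="blog/*"]], leaving plain [[links]] unambiguous; with it
set to 0, the old [[inline pages="blog/*"]] form keeps working, and the
bang form is accepted either way.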
the code, since that process can change internal state as needed, and
it will automatically be cleaned up for the parent process, which proceeds
to render the changes.