In test, set up the post-commit hook for more realism (and bugs!).
To make wrappers work in test, set PERL5LIB, and allow the wrappee's
path to be overridden. Meta-test that post-commit is really hooked
up by verifying that content is getting generated in destdir.
About the longstanding bug, which as far as I know was harmless:
CVS can't operate outside a srcdir, so we're always setting $CWD.
"local $CWD" restores the previous value when we go out of scope.
Usually that's correct. But if we're removing the last file from a
directory, the post-commit hook will exec in a working directory
that's about to not exist (CVS will prune it).
The fix: chdir() manually in cvs_runcvs(), so we can selectively
not chdir() back.
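A minimal sketch of the fix, with cvs_runcvs() reduced to its directory
handling (the real wrapper does more; error() and %config are IkiWiki's):

    use Cwd ();

    sub cvs_runcvs {
        my @cmd = @_;

        my $oldcwd = Cwd::getcwd();
        chdir $config{srcdir}
            or error("cannot chdir to $config{srcdir}: $!");

        my $ret = system("cvs", "-Q", @cmd) == 0;

        # Go back only if the previous directory still exists; it may
        # be the very directory CVS just pruned.
        chdir $oldcwd if defined $oldcwd && -d $oldcwd;

        return $ret;
    }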
If the title of a trail changes, each member of that trail must be
rebuilt, for its prev/up/next box to reflect the new title.
If the title of a member changes, its next and previous items (if any)
must be rebuilt, for their prev/up/next boxes to reflect the new title.
Previously, prune("wiki/srcdir/sandbox/test.mdwn") could delete srcdir
or even wiki, if they happened to be empty. This is rarely what you
want: there's usually some base directory (destdir, srcdir, transientdir
or another subdirectory of wikistatedir) beyond which you do not want to
delete.
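Roughly, the safer interface looks like this (a sketch; argument names
are illustrative): take an optional base directory and stop removing
empty parents once it is reached.

    use File::Basename qw(dirname);

    sub prune {
        my ($file, $up_to) = @_;

        unlink($file);
        my $dir = dirname($file);
        # Remove now-empty parent directories, but never walk up to or
        # past the base directory (if one was given).
        while ((! defined $up_to || $dir =~ m{^\Q$up_to\E/}) && rmdir($dir)) {
            $dir = dirname($dir);
        }
    }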
This ensures that when we do the second phase of the test (edit some
files and refresh), the changes get a different mtime and are picked up,
even if the entire test happened between two 1-second "clock ticks".
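One way to do that (a sketch; the real test may differ): backdate
everything written in the first phase with utime(), so the second
phase's edits get strictly newer mtimes.

    my $past = time() - 10;
    utime($past, $past, glob("$config{srcdir}/*.mdwn"));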
* Test that adding a text file under a name formerly tracked as
binary (and vice versa) gets the right keyword-substitution
behavior.
* Explicitly set -kkv for text files to make the tests pass.
* CVS warns in these cases about "changing keyword expansion mode",
but this is correct behavior, so filter it from stderr. Filter
stdout the same way in case we ever want to keep any of it.
* In rcs_add(), replace comments with obviousness.
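A hedged sketch of the keyword-substitution choice in rcs_add() (the
real code differs; Perl's -T heuristic stands in for a proper MIME check):

    sub rcs_add {
        my $file = shift;

        # Text files get explicit -kkv; everything else is binary.
        my $kflag = -T "$config{srcdir}/$file" ? '-kkv' : '-kb';
        cvs_runcvs('add', $kflag, $file);
    }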
Add a bunch more tests (that wind up exercising rcs_commit(),
rcs_commit_staged(), and rcs_recentchanges()). Extract some support
routines for brevity. Most is_in_keyword_substitution_mode() tests
are commented out because there's a bug -- non-binary files are
being added with "cvs add -kb".
Move tests that inspect recentchanges after direct CVS operations
into test_rcs_recentchanges().
In the code:
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in alphabetical order).
In the tests:
* general meta-behavior (in no particular order, yet),
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in semi-logical order).
* Add setup and teardown methods, called before and after every test sub.
* In setup, make a fresh repo; in teardown, throw it out.
* Extract runtests method and define default test methods at top.
* Move reflection routines near the xUnit-style subs they support.
Adapt existing test subs to run independently:
* In test_manual_add_and_commit(), assume a fresh repo.
While here, plan a bit better:
* Check for all modules used by cvs.pm.
* Check for program existence more generally.
* Check that we can rmdir after mkdir.
* Run all subs matching /^test_*/ (for which we can plan; sketched below)...
* Unless TEST_METHOD is set, in which case run matching subs (sans plan).
* Define total number of tests very near 'use Test::More', where expected.
* Define test tempdir where it's declared, no longer any reason why not.
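A rough sketch of the reflection and runner described above (sub names
like _setup() and _teardown() are illustrative):

    sub _test_methods {
        no strict 'refs';
        return grep { /^test_/ && defined &{"main::$_"} } sort keys %main::;
    }

    sub runtests {
        my @methods = defined $ENV{TEST_METHOD}
            ? grep { /$ENV{TEST_METHOD}/ } _test_methods()
            : _test_methods();
        foreach my $method (@methods) {
            _setup();
            no strict 'refs';
            &{"main::$method"}();
            _teardown();
        }
    }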
* Move most comments from TODO.cvs into t/cvs.t.
* Add a whole bunch more comments describing the needed test cases.
XXX existing tests are order-dependent, but currently happen to pass
* Call readfile() directly from writefile().
* Parameterize commit message for the web-commit case.
* Describe intent of test cases.
* Rename test subs to match what they actually do.
* To prove extra path slashes don't cause trouble, instead of running
the same tests a second time, just assert that checkconfig()
strips the slashes.
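Roughly, in Test::More terms (the cvspath option and its exact
normalization are assumptions here):

    $config{cvspath} = "ikiwiki//subdir/";
    IkiWiki::checkconfig();
    is($config{cvspath}, "ikiwiki/subdir",
        q{checkconfig() strips extra path slashes});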
Compute the test plan at runtime. Use IkiWiki unconditionally too (as that's
not what I'm testing here) to avoid the TAP error of printing a
test result before having printed the plan.
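The pattern, roughly: load Test::More without a compile-time count, then
call plan() once the count is known and before any test output.

    use Test::More;    # no fixed plan yet
    use IkiWiki;       # loaded unconditionally, before any test output

    my @cases = qw(add modify remove);       # hypothetical inputs
    plan(tests => 2 + 3 * scalar @cases);    # count computed at runtime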
In the first test, discount returns the html attributes in a different
order, which broke the test. Test only for the important text, not the
exact html output.
In the second test, discount does some encoding of its own of the partially
encoded url, again resulting in different output.
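The general pattern looks like this (a sketch; the real tests' input and
expected fragment differ):

    my $html = IkiWiki::htmlize("foo", "foo", "mdwn", "*hello*, world");
    # An exact is() comparison breaks when the markdown backend reorders
    # attributes or re-encodes URLs; match only the fragment that matters.
    like($html, qr{<em>hello</em>}, "emphasis rendered");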
This is such a pity. smcv had these great dates, but squeeze's Date::Parse
cannot parse them.
Oh well, at least it makes for a great bug closure title.
- Migrate the set of deletions to the {autofile} set, since it has
more or less the same effect. This affects the "deleted" case in the
test.
- If a page has just been deleted, add it as an autofile anyway: by
the time gen_autofile is called, it'll be in the list of deleted files,
so it'll just be added to {autofile}. This affects the "gone" case
in the test.
- Behaviour change: we don't forget that a page with no reason to be
re-created was deleted. This affects the 'expunged' and 'reinstated'
cases in the test.
This does cause a minor regression: index pages are now committed
individually rather than being a single commit per rebuild.
This also means the autoindex regression test needs to trigger the
autofile generation pass.
As index.{es,fr} don't exist, po::refreshpofiles copies them from the basewiki
underlay before running msgmerge. msgmerge marks as obsolete the translation
strings that came from the basewiki po files, but the link plugin
does not distinguish between obsolete and up-to-date links.
$links{'index.fr'} and $links{'index.es'} are therefore expected to contain
SandBox and ikiwiki.
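So the test expects, roughly (using the %links structure named above):

    is_deeply([sort @{$links{'index.fr'}}], [sort qw(SandBox ikiwiki)]);
    is_deeply([sort @{$links{'index.es'}}], [sort qw(SandBox ikiwiki)]);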
There are two sub-cases. If both source files still exist, it is
undefined which one renders the destination file. If one source file is deleted
and the other added, in a refresh, the new file will take over the
destination file.
Using named parameters for these is overdue. Passing the session in a
parameter instead of passing username and IP separately will later allow
storing other session info, like username or part of the email.
Note that these functions are not part of the exported API,
and the prototype change will catch (most) skew, so I am not changing
API versions. Any third-party plugins that call them will need to be
updated, though.
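A sketch of the shape of the change (parameter names are illustrative):

    # Before: rcs_commit($file, $message, $rcstoken, $user, $ipaddr)
    # After: named parameters, with the whole session passed along.
    sub rcs_commit {
        my %params = @_;
        my $user = defined $params{session}
            ? $params{session}->param("name")
            : undef;
        # ... commit $params{file} with $params{message} ...
    }

    rcs_commit(
        file    => $file,
        message => $message,
        token   => $rcstoken,
        session => $session,
    );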
* openid: Incorporated a fancy openid-selector signin form.
(http://code.google.com/p/openid-selector/)
* openid: Use "openid_identifier" as the form field, as required
by OpenID Authentication v2.0 spec.
Many calls to file_prune were incorrectly calling it with 2 parameters.
In cases where the filename being checked is relative to the srcdir,
that is not needed.
Absolute filenames are now pruned as well. (This won't work with the
2-parameter call style.)
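In Test::More terms, roughly (assuming the function is IkiWiki's
file_pruned(); the exact prune rules come from the config):

    ok(! file_pruned("index.mdwn"));          # relative to srcdir: 1 parameter suffices
    ok(  file_pruned("../../etc/passwd"));    # escapes the srcdir: pruned
    ok(  file_pruned("/etc/passwd"));         # absolute: now pruned as well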
This can be a lot faster, since huge numbers of pages are not sorted
only to mostly be thrown away. It sped up a build of my blog by at least
5 minutes.
The hash will be used to record a set of pages that influenced the
result of a pagespec match.
The influences are merged together when boolean and/or are encountered
in a pagespec. That means using a non-short-circuiting OR operator. And
so I use & and | when translating pagespecs, since those bitwise operators
can be overloaded. ("and" and "or" cannot, apparently).
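A minimal sketch of that trick (the real match-result classes carry
more): results overload & and |, merging influences as a side effect,
so neither operand gets short-circuited away.

    package MatchResult;

    use overload
        '&'  => \&merge_and,
        '|'  => \&merge_or,
        bool => sub { $_[0]->{ok} };

    sub new {
        my ($class, $ok, %influences) = @_;
        return bless { ok => $ok, influences => \%influences }, $class;
    }

    sub merge {
        my ($this, $other, $ok) = @_;
        my %influences = (%{$this->{influences}}, %{$other->{influences}});
        return MatchResult->new($ok, %influences);
    }

    sub merge_and { $_[0]->merge($_[1], $_[0]->{ok} && $_[1]->{ok}) }
    sub merge_or  { $_[0]->merge($_[1], $_[0]->{ok} || $_[1]->{ok}) }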
Since CVS can only operate from inside a srcdir, chdir() calls are
warranted. They shouldn't modify the caller's working directory,
though. Use File::chdir to keep the scope of the changes subroutine-local.
The tests now pass without resetting the working directory.
Now that dependencies are a list of pagespecs with an implicit "or"
operation, there's no need to try to merge pagespecs under normal use.
ikiwiki-transition contains the only use of the function, so move
it there rather than deleting it entirely (it's used to concatenate all
admins' lists of locked pages).
On a large wiki you can spend a lot of time reading through large lists
of dependencies to see whether files need to be rebuilt (album, with its
one-page-per-photo arrangement, suffers particularly badly from this).
The dependency list is currently a single pagespec, but it's not used like
a normal pagespec - in practice, it's a list of pagespecs joined with the
"or" operator.
Accordingly, change it to be stored as a list of pagespecs. On a wiki
with many tagged photo albums, this reduces the time to refresh after
`touch tags/*.mdwn` from about 31 to 25 seconds.
Getting the benefit of this change on an existing wiki requires a rebuild.
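Conceptually the stored dependency info changes like this (a simplified
sketch, not the literal on-disk format):

    # Before: one ever-growing pagespec, matched as a whole.
    $depends{'blog'} = "posts/* or tags/* or templates/page.tmpl";

    # After: a list of pagespecs with an implicit "or" between them,
    # so each can be matched (and usually rejected) independently.
    $depends{'blog'} = [ "posts/*", "tags/*", "templates/page.tmpl" ];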
The test suite was emitting a lot of ugly gettext warnings;
setting LC_ALL didn't solve the problem for all locale setups
(since ikiwiki remaps it to LANG, and ikiwiki didn't know about
the C locale).
People also seem generally annoyed by the messages when
Locale::Gettext is not installed, and I suspect will be
generally happier if it just silently doesn't localize.
The optimisation came about when I noticed that the gettext
sub was doing rather a lot of work each call just to see
if localisation is needed. We can avoid that work by caching,
and the best thing to cache is a version of the gettext sub
that does exactly the right thing.
This was slightly complicated by the locale setting,
which might need to override the original locale (or lack
thereof) after gettext has been called. So it needs to invalidate
the cache in that case. It used to do it via a global variable,
which I am happy to have also gotten rid of.
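A rough sketch of the caching (details differ from the real code;
set_locale() here stands in for the real locale handling):

    my $gettext_sub;    # cached; rebuilt only when the locale changes

    sub gettext {
        $gettext_sub ||= do {
            if (eval { require Locale::gettext } &&
                ($ENV{LANG} || $ENV{LC_ALL})) {
                my $d = Locale::gettext->domain('ikiwiki');
                sub { $d->get(shift) };    # localize
            }
            else {
                sub { shift };             # no localization at all
            }
        };
        return $gettext_sub->(@_);
    }

    sub set_locale {
        $ENV{LANG} = shift;
        undef $gettext_sub;    # invalidate; next call rebuilds the sub
    }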
A directive that contains an unterminated """ string should not
cause each word of the string to be treated as a bare word. Instead,
the directive should fail to parse.
There are two tests. One just checks that a complete directive
containing such a string fails to parse. The other checks for a case
where a directive ends with a very long unterminated """ string,
and the directive is itself not closed. While this test won't fail,
it does trigger a nasty perl warning.
See bug #411786. Perl's random corruption of the taint flag is even affecting
the untainting of source filenames now (which, AFAICS, is a proper untaint
and always worked before..), and that makes using ikiwiki in perl taint
mode not work at all.
And avoid a whole class of potential security problems (though
none that I know of actually exist..), by not performing
any string interpolation on user-supplied data when translating
pagespecs.
This works around an enormous (and, in this context, enormously confusing)
message that git has begun to print when one attempts to push changes into
a non-bare repo.
As a bonus, it now tests whether ikiwiki-makerepo works.
This may already work with other web servers that have copied apache's
interface, and it should be easy to add support for web servers that
use some other interface. So, make the name more general.