master
joey 2006-03-10 02:10:44 +00:00
parent 5e509b1438
commit a1997e1994
13 changed files with 565 additions and 0 deletions

Makefile 100644

@@ -0,0 +1,6 @@
all:
	./ikiwiki doc html

clean:
	rm -rf html
	rm -f doc/.index

doc/ikiwiki.mdwn 100644

@@ -0,0 +1,7 @@
IkiWiki is the engine driving this wiki, which exists to document ikiwiki.
The [[index]] is where you'll find actual useful info about it.

Why call it IkiWiki? Well, partly because I'm sure some people will find
this a pretty Iky Wiki, since it's so different from other Wikis. Partly
because "ikiwiki" is a nice palindrome. Partly because its design turns
the usual design for a Wiki inside-out and backwards.

doc/index.mdwn 100644

@@ -0,0 +1,36 @@
[[Ikiwiki]] is a wiki compiler. It converts a directory full of wiki pages
into html pages suitable for publishing on a website. Unlike a traditional
wiki, ikiwiki does not have its own means of storing page history, its own
markup language, or support for editing pages online.

To use [[ikiwiki]] to set up a wiki, you will probably want to use it with a
revision control system, such as [[Subversion]], for keeping track of past
versions of pages. ikiwiki can run as a Subversion post-commit hook, so
that each committed change to your wiki is immediately compiled and
published. (It can also be run by hand, by cron, or integrated with any
other revision control system.)
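
For example, a post-commit hook can be as simple as the following sketch
(the paths here are only examples, adjust them for your own checkout and
destination directory):

    #!/usr/bin/perl
    # Example post-commit hook: update a checkout of the wiki source,
    # then recompile it with ikiwiki. Paths are made up for illustration.
    system("svn", "update", "--quiet", "/srv/wiki/src") == 0
        or die "svn update failed";
    system("/srv/wiki/src/ikiwiki", "/srv/wiki/src/doc", "/srv/wiki/html") == 0
        or die "ikiwiki failed";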

[[Subversion]] also offers a way to let others edit pages on your wiki.
Just configure subversion to let appropriate users (or everyone) commit to
the wiki's repository. There are some things you should keep in mind about
[[Security]] when allowing the world to edit your ikiwiki.

ikiwiki supports pages using [[MarkDown]] as their markup language. Any
page with a filename ending in ".mdwn" is converted from markdown to html
by ikiwiki. Markdown understands text formatted as it would be in an email,
and is quite smart about converting it to html. The only additional markup
provided by ikiwiki aside from regular markdown is the [[WikiLink]].
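
For instance, a source file named somepage.mdwn (an illustrative name only)
containing markdown like this would be rendered to somepage.html, with the
[[WikiLink]] converted into a normal html link:

    ## A header

    A paragraph with *emphasis*, a [regular link](http://example.com/),
    and a [[WikiLink]] to the [[SandBox]].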

ikiwiki also supports files of any other type, including raw html, text,
images, etc. These are not converted to wiki pages, they are just copied
unchanged by ikiwiki as it builds your wiki. So you can check in an image,
program, or other special file and link to it from your wiki pages.

ikiwiki also supports making one page that is a [[SubPage]] of another.

[[TODO]] lists things that need to be added to ikiwiki before most people
would consider it a full-fledged wiki.

All wikis are supposed to have a [[SandBox]], so this one does too.
If you'd like to try editing pages on this wiki, do whatever you'd like in
the [[SandBox]].

[[ikiwiki]] is developed by JoeyHess.

doc/joeyhess.mdwn 100644

@@ -0,0 +1,6 @@
Joey Hess is <a href="mailto:joey@kitenet.net">joey@kitenet.net</a>.
His web page is [here](http://kitenet.net/~joey/).

Joey hates programming web crap, and hates being locked into a web browser
to do something, and this probably shows in the design choices made in
ikiwiki.

doc/markdown.mdwn 100644

@@ -0,0 +1,9 @@
[Markdown](http://daringfireball.net/projects/markdown/)
is a minimal markup language that resembles plain text as used in
email messages. It is the markup language used by this wiki.

For documentation about the markdown syntax, see
[Markdown: syntax](http://daringfireball.net/projects/markdown/syntax).

Note that [[WikiLink]]s are not part of the markdown syntax, and are the
only bit of markup that this wiki handles internally.

doc/sandbox.mdwn 100644

@@ -0,0 +1,35 @@
This is the SandBox, a page anyone can edit to try out ikiwiki.

See [[MarkDown]] for documentation of the markup syntax used on this page.

----

Here's a paragraph.

Here's another one.

# Header

## Subheader

> This is a blockquote.
>
> This is the first level of quoting.
>
> > This is nested blockquote.
>
> Back to the first level.

Numbered list

1. First item.
1. Another.
1. And another..

Bulleted list

* item
* item
* item

Link back to the [[index]].

doc/security.mdwn 100644

@@ -0,0 +1,39 @@
If you are using ikiwiki to render pages that only you can edit, then there
are no more security issues with this program than with cat(1). If,
however, you let others edit pages in your wiki, then some security issues
do need to be kept in mind.

## html attacks

ikiwiki does not attempt to do any sanitization of the html on the wiki.
MarkDown allows embedding of arbitrary html into a markdown document. If
you let anyone else edit files on the wiki, then anyone can have fun exploiting
the web browser bug of the day. This type of attack is typically referred
to as an XSS attack ([google](http://www.google.com/search?q=xss+attack)).

## image files etc attacks

If it encounters a file type it does not understand, ikiwiki just copies it
into place. So if you let users add any kind of file they like, they can
upload images, movies, windows executables, etc. If these files exploit
security holes in the browser of someone who's viewing the wiki, that can
be a security problem.

## exploiting ikiwiki with bad content

Someone could add bad content to the wiki and hope to exploit ikiwiki.
Note that ikiwiki runs with perl taint checks on, so this is unlikely;
the only data that is not subject to full taint checking is the names of
files, and filenames are sanitised.

## cgi scripts

ikiwiki does not allow cgi scripts to be published as part of the wiki. Or
rather, the script is published, but it's not marked executable, so
hopefully your web server will not run it.

## web server attacks

If your web server does any parsing of special sorts of files (for example,
server parsed html files), then if you let anyone else add files to the wiki,
they can try to use this to exploit your web server.

doc/subpage.mdwn 100644

@@ -0,0 +1,11 @@
[[ikiwiki]] supports placing pages in a directory hierarchy. For example,
this page, [[SubPage]], has some related pages placed under it, like
[[SubPage/LinkingRules]]. This is a useful way to add some order to your
wiki rather than just having a great big directory full of pages.

To add a SubPage, just make a subdirectory and put pages in it. For
example, this page is SubPage.mdwn in this wiki's source, and there is also
a SubPage subdirectory, which contains SubPage/LinkingRules.mdwn. Subpages
can be nested as deeply as you'd like.

Linking to and from a SubPage is explained in [[LinkingRules]].

doc/subpage/linkingrules.mdwn 100644

@@ -0,0 +1,21 @@
To link to or from a [[SubPage]], you can normally use a regular
[[WikiLink]] that does not contain the name of the parent directory of
the [[SubPage]]. Ikiwiki descends the directory hierarchy looking for a
page that matches your link.

For example, if FooBar/SubPage links to "OtherPage", ikiwiki will first
prefer pointing the link to FooBar/SubPage/OtherPage if it exists, next
to FooBar/OtherPage and finally to OtherPage in the root of the wiki.

Note that this means that if a link on FooBar/SomePage to "OtherPage"
currently links to OtherPage in the root of the wiki, and FooBar/OtherPage
is created, the link will _change_ to point to FooBar/OtherPage. On the
other hand, a link from BazBar to "OtherPage" would be unchanged by this
creation of a [[SubPage]] of FooBar.

You can also specify a link that contains a directory name, like
"FooBar/OtherPage", to more exactly specify what page to link to. This is
the only way to link to an unrelated [[SubPage]].

You can use this, for example, to link from BazBar to "FooBar/SubPage",
or from BazBar/SubPage to "FooBar/SubPage".
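
To spell out the search order in the example above, a link to "OtherPage"
on the page FooBar/SubPage is resolved by checking for these pages, in this
order, and using the first one that exists:

    FooBar/SubPage/OtherPage
    FooBar/OtherPage
    OtherPage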

doc/subversion.mdwn 100644

@@ -0,0 +1,3 @@
Subversion is a revision control system. While ikiwiki is relatively
independent of the underlying revision control system, using it with
Subversion is recommended.

doc/todo.mdwn 100644

@@ -0,0 +1,43 @@
## online page editing

To support editing pages in a web browser, a CGI script is needed that
pulls the page out of [[Subversion]], presents it to the user for editing,
and then commits the changed page back to [[Subversion]].

Due to [[WikiSpam]], this will probably also need to incorporate a user
registration system. So there will need to be a script that handles logins
and registrations, sets a cookie, and the page editor can refuse to edit
pages for users who aren't logged in, and include a note of who made the
change in the svn log.

If possible I'd prefer to use someone else's generic web user registration
and login system, if one exists.
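
A very rough sketch of how such a CGI might look (hypothetical code, not
part of ikiwiki; the checkout path is made up, and the registration and
login handling discussed above is left out entirely):

    #!/usr/bin/perl
    # Hypothetical sketch only: edit a wiki page via the web and commit it
    # back to svn. Real code would add user registration, login and locking.
    use CGI;
    use HTML::Entities;

    my $srcdir="/path/to/svn/checkout/doc"; # made-up location
    my $q=CGI->new;
    my ($page)=($q->param('page') || "")=~/^([-A-Za-z0-9_\/]+)$/;
    die "bad page name" unless defined $page;

    if (defined $q->param('content')) {
        # Save the edited page and check it in.
        open(OUT, ">", "$srcdir/$page.mdwn") || die "write: $!";
        print OUT scalar $q->param('content');
        close OUT;
        system("svn", "commit", "-m", "web edit of $page", $srcdir);
        print $q->redirect("$page.html");
    }
    else {
        # Show an edit form holding the current page source.
        open(IN, "<", "$srcdir/$page.mdwn") || die "read: $!";
        my $content=do { local $/; <IN> };
        close IN;
        print $q->header, "<form method=\"post\">",
            "<input type=\"hidden\" name=\"page\" value=\"$page\">",
            "<textarea name=\"content\">", encode_entities($content),
            "</textarea><input type=\"submit\" value=\"Save\"></form>";
    }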

## [[RecentChanges]]

This will need to be another cgi script that grubs through the
[[Subversion]] logs.

This should support RSS for notification of new and changed pages.

## page history

To see past versions of a page, we can either implement a browser for that,
or just provide a way to link to the page in viewcvs.

## pluggable renderers

I'm considering a configurable rendering pipeline for each supported
filename extension. So for ".mdwn" files, it would send the content through
linkify, markdown, and finalize, while for ".wiki" files it might send it
through just a wiki formatter and finalize.

This would allow not only supporting more types of markup, but changing
what style of [[WikiLink]]s are supported; maybe some people want to add
[[CamelCase]], for example, or don't like the [[SubPage/LinkingRules]].

The finalize step is where the page gets all the pretty junk around the
edges, so that clearly needs to be pluggable too.

There could also be a step before finalize, where stuff like lists of pages
that linked back to it could be added to the page.
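
Roughly, the configuration for such a pipeline might look like this (a
hypothetical sketch; the filters here are empty placeholders, and none of
this exists yet):

    #!/usr/bin/perl
    # Hypothetical sketch of a configurable rendering pipeline: each
    # filename extension maps to an ordered list of filter subs.
    use strict;
    use warnings;

    sub linkify    { my $c=shift; $c }                 # placeholder filters
    sub markdown   { my $c=shift; $c }
    sub wikiformat { my $c=shift; $c }
    sub finalize   { my $c=shift; "<html>$c</html>" }

    my %pipeline=(
        ".mdwn" => [\&linkify, \&markdown, \&finalize],
        ".wiki" => [\&wikiformat, \&finalize],
    );

    sub renderas {
        my ($ext, $content)=@_;
        for my $filter (@{$pipeline{$ext}}) {
            $content=$filter->($content);
        }
        return $content;
    }

    print renderas(".mdwn", "some page content"), "\n";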

doc/wikilink.mdwn 100644

@@ -0,0 +1,9 @@
WikiLinks provide easy linking between pages of the wiki. To create a
WikiLink, just put the name of the page to link to in double brackets. For
example, "[[ WikiLink ]]" (without the added whitespace).

Note that there are some special [[SubPage/LinkingRules]] that come into
play when linking between [[SubPage]]s.

WikiLinks can be entered in any case you like; the page they link to is
always lowercased.

ikiwiki 100755

@@ -0,0 +1,340 @@
#!/usr/bin/perl -T

use warnings;
use strict;
use File::Find;
use Memoize;
use File::Spec;

BEGIN {
	# Pull in Markdown.pl as a library; defining $blosxom::version keeps
	# it from trying to run on its own.
	$blosxom::version="is a proper perl module too much to ask?";
	do "/usr/bin/markdown";
}

memoize('pagename');
memoize('bestlink');

my ($srcdir)= shift =~ /(.*)/; # untaint
my ($destdir)= shift =~ /(.*)/; # untaint
my $link=qr/\[\[([^\s]+)\]\]/;
my $verbose=1;

my %links;
my %oldpagemtime;
my %renderedfiles;
sub error ($) {
	die @_;
}

sub debug ($) {
	print "@_\n" if $verbose;
}

sub mtime ($) {
	my $page=shift;
	return (stat($page))[9];
}

sub basename {
	my $file=shift;
	$file=~s!.*/!!;
	return $file;
}

sub dirname {
	my $file=shift;
	$file=~s!/?[^/]+$!!;
	return $file;
}

sub pagetype ($) {
	my $page=shift;
	if ($page =~ /\.mdwn$/) {
		return ".mdwn";
	}
	else {
		return "unknown";
	}
}

sub pagename ($) {
	my $file=shift;
	my $type=pagetype($file);
	my $page=$file;
	$page=~s/\Q$type\E*$// unless $type eq 'unknown';
	return $page;
}

sub htmlpage ($) {
	my $page=shift;
	return $page.".html";
}

sub readpage ($) {
	my $page=shift;
	local $/=undef;
	open (PAGE, "$srcdir/$page") || error("failed to read $page: $!");
	my $ret=<PAGE>;
	close PAGE;
	return $ret;
}

# Writes a page out under $destdir, creating any missing parent directories.
sub writepage ($$) {
	my $page=shift;
	my $content=shift;
	my $dir=dirname("$destdir/$page");
	if (! -d $dir) {
		my $d="";
		foreach my $s (split(m!/+!, $dir)) {
			$d.="$s/";
			if (! -d $d) {
				mkdir($d) || error("failed to create directory $d: $!");
			}
		}
	}
	open (PAGE, ">$destdir/$page") || error("failed to write $page: $!");
	print PAGE $content;
	close PAGE;
}

# Returns a list of the links found on a page, lowercased.
sub findlinks {
	my $content=shift;
	my @links;
	while ($content =~ /$link/g) {
		push @links, lc($1);
	}
	return @links;
}
# Given a page and the text of a link on the page, determine which existing
# page that link best points to. Prefers pages under a subdirectory with
# the same name as the source page, failing that goes down the directory tree
# to the base looking for matching pages.
sub bestlink ($$) {
	my $page=shift;
	my $link=lc(shift);

	my $cwd=$page;
	do {
		my $l=$cwd;
		$l.="/" if length $l;
		$l.=$link;
		if (exists $links{$l}) {
			#debug("for $page, \"$link\", use $l");
			return $l;
		}
	} while $cwd=~s!/?[^/]+$!!;

	print STDERR "warning: page $page, broken link: $link\n";
	return "";
}

sub isinlinableimage ($) {
	my $file=shift;
	$file=~/\.(png|gif|jpg|jpeg)$/;
}

sub htmllink ($$) {
	my $page=shift;
	my $link=shift;

	my $bestlink=bestlink($page, $link);

	return $page if $page eq $bestlink;

	if (! grep { $_ eq $bestlink } values %renderedfiles) {
		$bestlink=htmlpage($bestlink);
	}
	if (! grep { $_ eq $bestlink } values %renderedfiles) {
		return "<a href=\"?\">?</a>$link"
	}

	$bestlink=File::Spec->abs2rel($bestlink, dirname($page));

	if (isinlinableimage($bestlink)) {
		return "<img src=\"$bestlink\">";
	}
	return "<a href=\"$bestlink\">$link</a>";
}

sub linkify ($$) {
	my $content=shift;
	my $file=shift;

	$content =~ s/$link/htmllink(pagename($file), $1)/eg;
	return $content;
}

sub htmlize ($$) {
	my $type=shift;
	my $content=shift;

	if ($type eq '.mdwn') {
		return Markdown::Markdown($content);
	}
	else {
		error("htmlization of $type not supported");
	}
}

sub finalize ($$) {
	my $content=shift;
	my $page=shift;

	my $title=basename($page);
	$title=~s/_/ /g;

	$content="<html>\n<head><title>$title</title></head>\n<body>\n".
		$content.
		"</body>\n</html>\n";
	return $content;
}
# Renders a single file: source pages are converted to html, anything else
# is copied through unchanged.
sub render ($) {
	my $file=shift;

	my $type=pagetype($file);
	my $content=readpage($file);
	if ($type ne 'unknown') {
		my $page=pagename($file);
		$links{$page}=[findlinks($content)];
		$content=linkify($content, $file);
		$content=htmlize($type, $content);
		$content=finalize($content, $page);
		writepage(htmlpage($page), $content);
		$oldpagemtime{$page}=time;
		$renderedfiles{$page}=htmlpage($page);
	}
	else {
		$links{$file}=[];
		writepage($file, $content);
		$oldpagemtime{$file}=time;
		$renderedfiles{$file}=$file;
	}
}

# Loads saved state from .index: each line holds a page's mtime, its
# rendered file, and the links found on it.
sub loadindex () {
	open (IN, "$srcdir/.index") || return;
	while (<IN>) {
		chomp;
		my ($mtime, $page, $rendered, @links)=split(' ', $_);
		$oldpagemtime{$page}=$mtime;
		$links{$page}=\@links;
		($renderedfiles{$page})=$rendered=~m/(.*)/; # untaint
	}
	close IN;
}

# Saves state to .index for the next run.
sub saveindex () {
	open (OUT, ">$srcdir/.index") || error("cannot write to .index: $!");
	foreach my $page (keys %oldpagemtime) {
		print OUT "$oldpagemtime{$page} $page $renderedfiles{$page} ".
			join(" ", @{$links{$page}})."\n"
				if $oldpagemtime{$page};
	}
	close OUT;
}

# Removes a rendered file, and any parent directories left empty.
sub prune ($) {
	my $file=shift;

	unlink($file);
	my $dir=dirname($file);
	while (rmdir($dir)) {
		$dir=dirname($dir);
	}
}
sub refresh () {
	# Find existing pages.
	my %exists;
	my @files;
	find({
		no_chdir => 1,
		wanted => sub {
			if (/\/\.svn\//) {
				$File::Find::prune=1;
			}
			elsif (! -d $_ && ! /\.html$/ && ! /\/\./) {
				my ($f)=/(^[-A-Za-z0-9_.:\/+]+$)/; # untaint
				if (! defined $f) {
					warn("skipping bad filename $_\n");
				}
				else {
					$f=~s/^\Q$srcdir\E\/?//;
					push @files, $f;
					$exists{pagename($f)}=1;
				}
			}
		},
	}, $srcdir);

	# check for added or removed pages
	my @adddel;
	foreach my $file (@files) {
		my $page=pagename($file);
		if (! $oldpagemtime{$page}) {
			debug("new page $page");
			push @adddel, $page;
			$links{$page}=[];
		}
	}
	foreach my $page (keys %oldpagemtime) {
		if (! $exists{$page}) {
			debug("removing old page $page");
			prune($destdir."/".$renderedfiles{$page});
			delete $renderedfiles{$page};
			$oldpagemtime{$page}=0;
			push @adddel, $page;
		}
	}

	# render any updated files
	foreach my $file (@files) {
		my $page=pagename($file);
		if (! exists $oldpagemtime{$page} ||
		    mtime("$srcdir/$file") > $oldpagemtime{$page}) {
			debug("rendering changed file $file");
			render($file);
		}
	}

	# if any files were added or removed, check to see if each page
	# needs an update due to linking to them
	if (@adddel) {
		FILE: foreach my $file (@files) {
			my $page=pagename($file);
			foreach my $p (@adddel) {
				foreach my $link (@{$links{$page}}) {
					if (bestlink($page, $link) eq $p) {
						debug("rendering $file, which links to $p");
						render($file);
						next FILE;
					}
				}
			}
		}
	}
}

loadindex();
refresh();
saveindex();