Merge branch 'master' of ssh://git.ikiwiki.info

master
Joey Hess 2013-09-30 10:46:10 -04:00
commit 7878e6d2b8
11 changed files with 279 additions and 1 deletions

View File

@ -81,3 +81,7 @@ Please tell me if you need more info. The same openid worked fine to login to *
>>>>> Investigation revealed it was a bug in the freebsd patch, which I
>>>>> understand is going to be dealt with. [[done]] --[[Joey]]
I am getting the same error here with ikiwiki 3.20120629 (wheezy). I had trouble with ikiwiki-hosting configurations of OpenID, basically related to the `openid_realm` parameter - which I had to comment out. But now it seems to fail regardless. --[[anarcat]]
> Never mind, this was because I was blocking cookies on the CGI (!!). The message *could* be improved though; it's not the first time I've stumbled over this... --[[anarcat]]

View File

@ -5,5 +5,6 @@ to handle such conversions.
* [[tips/convert_mediawiki_to_ikiwiki]]
* [[tips/convert_moinmoin_to_ikiwiki]]
* [[tips/convert_blogger_blogs_to_ikiwiki]]
* [[tips/Movable_Type_to_ikiwiki]]
In addition, [[JoshTriplett]] has written scripts to convert Twiki sites, see [his page](/users/JoshTriplett) for more information.

View File

@ -0,0 +1,65 @@
# How to avoid heavy files in the ikiwiki git repo

Continuing the discussion at the [git-annex forum](http://git-annex.branchable.com/forum/git-annex___38___ikiwiki_experiment/), it turns out the git-annex tricks could be avoided.

## Setup on the remote server

On the server, activate the album and underlay plugins in the $wiki.setup file:

    add_plugins:
    - album
    - underlay

Configure the underlay plugin:

    add_underlays:
    - /home/$user/$wiki.underlay

Create the underlay directory and init git-annex in direct mode:

    mkdir ~/$wiki.underlay
    cd ~/$wiki.underlay; git init; git annex init $srcunderlay; git annex direct

Build ikiwiki for good measure:

    ikiwiki --setup $wiki.setup --rebuild

## Setup on the local laptop

Clone to the laptop and initialise the annex repo:

    git clone ssh://$server/$wiki.git ~/$wiki
    git clone ssh://$server/$wiki.underlay ~/$wiki.underlay
    cd ~/$wiki.underlay; git annex init $wrkunderlay
    git remote add $srcunderlay ssh://$server/$wiki.underlay

You now have an annex repo in the local $wiki.underlay called $wrkunderlay, and one in the $wiki.underlay directory on the remote server called $srcunderlay.

## Add content locally

Add content to the local $wiki directory; in this case, create an $album.mdwn file for every album you have. Then `git add; git commit` files containing at minimum the following:

    [[!album ]]

Create directories in the local $wiki.underlay corresponding to the album files in the local $wiki dir, i.e. create a directory named $album for every $album.mdwn file. Copy the hi-res jpg files into each directory in the local $wiki.underlay, then add and commit them:

    git annex add .
    git commit -m 'jpgs added'

## Push to the remote

    cd $wrkunderlay; git annex copy --to $srcunderlay .; git annex sync
    cd $wrkdir; git push

That's it! Ikiwiki should update the website and treat the jpgs as if they were part of the standard file structure.

How to accomplish this using the web interface is another question. I guess the plugins would have to be set up to upload to the underlaydir somehow.

My guess is that you have to `git annex copy` the $wiki.underlay files to $srcunderlay **before** running `git push` from the local $wiki directory. Haven't tested this yet though.

View File

@ -0,0 +1,18 @@
[[!comment format=mdwn
username="http://smcv.pseudorandom.co.uk/"
nickname="smcv"
subject="comment 1"
date="2013-09-26T13:11:55Z"
content="""
\"I guess the plugins have to setup and upload to underlaydir somehow\" -
yes, the hypothetical specialized CGI interface mentioned at the end of
[[plugins/contrib/album]] would ideally be able to do that.
I'd also like to be able to keep full-resolution photos on my laptop
but mangle them down to a more web-compatible resolution in a
separate underlay that is what actually gets uploaded, also as described
on that page - but that doesn't make a great deal of sense for a
non-CGI workflow, since if you're uploading full-resolution photos to
the CGI, you've already done the big data transfer whether you
intended to or not :-)
"""]]

View File

@ -0,0 +1,23 @@
[[!comment format=mdwn
username="https://www.google.com/accounts/o8/id?id=AItOawkickHAzX_uVJMd_vFJjae6SLs2G38URPU"
nickname="Kalle"
subject="comment 2"
date="2013-09-26T13:38:49Z"
content="""
I recreated this post in the tips section, [Ikiwiki with git-annex, the album and the underlay plugins](http://ikiwiki.info/tips/Ikiwiki_with_git-annex__44___the_album_and_the_underlay_plugins/), as per anarcat's suggestion.
@smcv
> I'd also like to be able to keep full-resolution photos on my laptop
> but mangle them down to a more web-compatible resolution in a separate
> underlay that is what actually gets uploaded, also as described on that page
Yes, I can see that some have a use for that function. I try to provide hi-res versions of all my images, as they are more useful to people. For the Stockholm site, though, I've halved the size already, as there are too many photos of too dubious quality... and it's easy enough to do that locally before uploading.
As you can see on the [about](http://stockholm.kalleswork.net/tech/) page, I use a shell script to do the metadata stuff, even adding the actual download link for the hi-res version. Couldn't figure out how to do it in album.pm before I opted for a simpler solution ;) So for me, local work is necessary anyway atm.
Another potential problem with the underlay: how would changes to underlay files be detected?
Feel free to delete this whole forum post in favor of the version in tips.
"""]]

View File

@ -0,0 +1,17 @@
[[!comment format=mdwn
username="http://smcv.pseudorandom.co.uk/"
nickname="smcv"
subject="comment 3"
date="2013-09-26T14:38:45Z"
content="""
> \"how changes to underlay files would be detected?\"
Changes to files in underlays are picked up automatically, as long as
their mtime changed.
> Couldn't figure out how to do [metadata] in album.pm
Yeah, I need to implement a hook mechanism or something
(and work out where I put the exif plugin mentioned in
the page).
"""]]

View File

@ -0,0 +1,17 @@
### CGI requirement when using \[\[\!waypoint\]\] on pages?
Most of the osm plugin works well without CGI. The links from waypoints, however, use ikiwiki.cgi, at least in my configuration. Is this actually required, or is it possible to use a pre-rendered page to avoid running CGI on the server?
At the moment I'd prefer not to run CGI, and one of the advantages of a wiki compiler is that you can run without CGI on the server.
This is a minor issue, but I'd be interested to know whether you think it's possible to pre-render waypoint maps, i.e. maps that center on a specific waypoint whilst having all the waypoints of that map visible.
### Configure all osm tags to use same icon?
Setting the default `osm_tag_default_icon` does not seem to work. All tagged waypoint pages now want their own unique icon and display a broken image if it is not present. Populating the tag folder with identical icons gets a bit much when there are a lot of tags.
### \[Wishlist\] Setting a unique icon for the "active waypoint"
For usability it would be great if it were possible to display the active waypoint with a different icon, so that clicking a waypoint map symbol takes you to a map with lots of waypoints where the waypoint from the source page is centered (as per current behaviour) **and** has a different icon.
*PS. The osm plugin is amazing!*

View File

@ -0,0 +1,64 @@
# How to avoid heavy files in the ikiwiki git repo

Continuing the discussion at the [git-annex forum](http://git-annex.branchable.com/forum/git-annex___38___ikiwiki_experiment/), it turns out the git-annex tricks could be avoided.

## Setup on the remote server

On the server, activate the album and underlay plugins in the $wiki.setup file:

    add_plugins:
    - album
    - underlay

Configure the underlay plugin:

    add_underlays:
    - /home/$user/$wiki.underlay

Create the underlay directory and init git-annex in direct mode:

    mkdir ~/$wiki.underlay
    cd ~/$wiki.underlay; git init; git annex init $srcunderlay; git annex direct

Build ikiwiki for good measure:

    ikiwiki --setup $wiki.setup --rebuild

## Setup on the local laptop

Clone to the laptop and initialise the annex repo:

    git clone ssh://$server/$wiki.git ~/$wiki
    git clone ssh://$server/$wiki.underlay ~/$wiki.underlay
    cd ~/$wiki.underlay; git annex init $wrkunderlay
    git remote add $srcunderlay ssh://$server/$wiki.underlay

You now have an annex repo in the local $wiki.underlay called $wrkunderlay, and one in the $wiki.underlay directory on the remote server called $srcunderlay.

## Add content locally

Add content to the local $wiki directory; in this case, create an $album.mdwn file for every album you have. Then `git add; git commit` files containing at minimum the following:

    [[!album ]]

Create directories in the local $wiki.underlay corresponding to the album files in the local $wiki dir, i.e. create a directory named $album for every $album.mdwn file. Copy the hi-res jpg files into each directory in the local $wiki.underlay, then add and commit them:

    git annex add .
    git commit -m 'jpgs added'

## Push to the remote

    cd $wrkunderlay; git annex copy --to $srcunderlay .; git annex sync
    cd $wrkdir; git push

That's it! Ikiwiki should update the website and treat the jpgs as if they were part of the standard file structure.

How to accomplish this using the web interface is another question. I guess the plugins would have to be set up to upload to the underlaydir somehow.

My guess is that you have to `git annex copy` the $wiki.underlay files to $srcunderlay **before** running `git push` from the local $wiki directory. Haven't tested this yet though.
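
Putting the local steps together, a rough publish sequence might look like the sketch below (untested; it assumes the $wiki, $wiki.underlay, $wrkdir, $wrkunderlay and $srcunderlay names used above):

    # On the laptop, after creating new $album.mdwn pages and copying photos
    # into the matching directories of the local $wiki.underlay:
    cd ~/$wiki.underlay                   # the $wrkunderlay annex
    git annex add .                       # annex the new hi-res jpgs
    git commit -m 'new photos'
    git annex copy --to $srcunderlay .    # ship the file contents to the server first
    git annex sync                        # then sync the annex metadata

    cd ~/$wiki                            # the $wrkdir clone of $wiki.git
    git add .
    git commit -m 'new albums'
    git push                              # triggers the ikiwiki rebuild on the server

If the guess above is right, the ordering matters: the photo contents need to reach $srcunderlay before the push triggers the rebuild, otherwise ikiwiki would rebuild against an underlay that does not yet contain the files.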

View File

@ -0,0 +1,37 @@
This script can be used to convert your existing Movable Type blog/database to an ikiwiki blog.

First, go to your MT Admin panel and purge all spam comments/trackbacks. Then use this script: <http://anti.teamidiot.de/static/nei/*/Code/MovableType/mtdump_to_iki.pl>

If you wrote your posts with markdown already, you're pretty much ikiwiki-compatible :-)

    DATABASE_NAME=your_mt_database
    DATABASE_USER=your_mysql_user
    mkdir -p conv/posts
    mysqldump $DATABASE_NAME -v -nt --compatible=ansi,postgresql \
        --complete-insert=TRUE --extended-insert=FALSE --compact \
        --default-character-set=UTF8 -u $DATABASE_USER \
        | perl mtdump_to_iki.pl

The script will spit out one file for every post into the conv/posts directory. You can clean them up manually however you like. Next, you must set the output directory to where your ikiwiki resides:

    export OUT=$HOME/my_ikiwiki_blog

Make sure there is a 'posts' subdirectory inside (the default if you start with the blog setup script).

Now you can import one or all posts and comments by running the post file through zsh:

    zsh ./1__my_first_post.mdwn

Or, to do it all:

    zsh
    for import (<->__*.*) { zsh $import }

The files will be created in your $OUT directory and committed to git. Now the **important** last step: run

    ikiwiki --gettime --setup your.setup

Only with the `--gettime` flag will ikiwiki reread the file dates as recorded in git. Enjoy!
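
For convenience, the steps above could be strung together roughly as in the sketch below (untested; it assumes mtdump_to_iki.pl sits in the current directory, that the import is run from conv/posts, and that your setup file is called your.setup as above):

    #!/bin/sh
    # Rough sketch of the full conversion, following the steps above.
    DATABASE_NAME=your_mt_database
    DATABASE_USER=your_mysql_user
    export OUT=$HOME/my_ikiwiki_blog   # existing ikiwiki checkout with a posts/ subdirectory

    mkdir -p conv/posts
    mysqldump $DATABASE_NAME -v -nt --compatible=ansi,postgresql \
        --complete-insert=TRUE --extended-insert=FALSE --compact \
        --default-character-set=UTF8 -u $DATABASE_USER \
        | perl mtdump_to_iki.pl

    # import every converted post (zsh numeric glob, as above)
    cd conv/posts
    zsh -c 'for import (<->__*.*) { zsh $import }'

    # make ikiwiki pick up the original post dates from git
    ikiwiki --gettime --setup your.setup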

View File

@ -0,0 +1,32 @@
For security reasons, ikiwiki.cgi should only be accessed via HTTPS, which is easy to set in the config. However, each wiki page contains

    <link rel="stylesheet" href="http://ikiwiki.info/style.css" type="text/css" />
    <link rel="stylesheet" href="http://ikiwiki.info/local.css" type="text/css" />

regardless of whether the site is accessed via HTTP or HTTPS, which causes most modern browsers to automatically disable javascript and complain about the site only being partially encrypted. Features such as the openID-selector stop working unless the user manually allows the browser to execute unsafe scripts on the site.

This can be fixed by setting the base wiki url to a protocol-relative URL, such as

    //wiki.example.com

but this breaks all sorts of things, like the 404 plugin, and wiki rebuilds will throw the following perl warning several times:

    Use of uninitialized value in string ne at /usr/share/perl5/IkiWiki.pm line 586
> With a vaguely recent ikiwiki, if your `url` and `cgiurl` settings have the
> same hostname (e.g.
> `url => "http://www.example.com", cgiurl => "https://www.example.com/ikiwiki.cgi"`),
> most links are path-only (e.g. `/style.css`), and in particular,
> CGI-generated pages should generate those links. This was the implementation of
> [[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]].
>
> If your `$config{url}` and `$config{cgiurl}` have different hostnames (e.g.
> `url => "http://wiki.example.com", cgiurl => "http://cgi.example.com/ikiwiki.cgi"`)
> then you might still have this problem. In principle, IkiWiki could generate
> protocol-relative URLs in this situation, but it isn't clear to me how
> widely-supported those are.
>
> If you set both the `$config{url}` and `$config{cgiurl}` to https, but make
> the resulting HTML available over HTTP as well as HTTPS, that should work
> fine - accesses will be over http until the user either explicitly
> navigates to https, or navigates to the CGI. --[[smcv]]
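
For reference, a minimal setup-file excerpt matching that last suggestion (YAML setup format, with both URLs on the same https host; example.com is of course a placeholder) might look like:

    url: https://www.example.com/
    cgiurl: https://www.example.com/ikiwiki.cgi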

View File

@ -1 +1 @@
There are [some issue](http://www.branchable.com/bugs/Exception:_Cannot_open_tables_at_consistent_revisions_at___47__usr__47__lib__47__perl5__47__Search__47__Xapian__47__WritableDatabase.pm_line_41./#comment-c159ea3f9be35fcd9ed0eeedb162e816) with the current search engine. Sometimes the database gets corrupted and it's not very good at weighting say, the title against the content. For example, [searching for pagespec](http://ikiwiki.info/ikiwiki.cgi?P=pagespec) in this wiki doesn't lead to the [[ikiwiki/pagespec]] page in the first page... but in the third page. In [[different_search_engine]], there was the idea of using Lucene - is there any reason why we should have both, or at least let lucene live in contrib?
There are [some issues](http://www.branchable.com/bugs/Exception:_Cannot_open_tables_at_consistent_revisions_at___47__usr__47__lib__47__perl5__47__Search__47__Xapian__47__WritableDatabase.pm_line_41./#comment-c159ea3f9be35fcd9ed0eeedb162e816) with the current search engine. Sometimes the database gets corrupted, and it's not very good at weighting, say, the title against the content. For example, [searching for pagespec](http://ikiwiki.info/ikiwiki.cgi?P=pagespec) in this wiki doesn't show the [[ikiwiki/pagespec]] page on the first page of results... but on the third page. In [[different_search_engine]], there was the idea of using Lucene - is there any reason why we shouldn't have both, or at least let lucene live in contrib? --[[anarcat]]