Merge branch 'master' of ssh://git.ikiwiki.info/srv/git/ikiwiki.info

Joey Hess 2011-03-21 13:47:12 -04:00
commit 14b8abe60f
2 changed files with 65 additions and 0 deletions


@@ -0,0 +1,53 @@
For security reasons, one of the sites I'm in charge of uses a reverse proxy to grab the content from another machine behind our firewall.
Let's call the out-facing machine Alfred and the one behind the firewall Betty.
For the static pages, everything is fine. However, when trying to use the search, all the links break.
This is because, when Alfred passes the search query on to Betty, the search result has a "base" tag which points to Betty, and all the links to the "found" pages are relative.
So we have:

    <base href="Betty.example.com"/>
    ...
    <a href="./path/to/found/page/">path/to/found/page</a>
This breaks things for anyone on Alfred, because Betty is behind a firewall and they can't get there.
It would be better if the "base" didn't reference the hostname at all, and if the "found" links weren't relative.
Something like this:

    <base href="/"/>
    ...
    <a href="/path/to/found/page/">path/to/found/page</a>
The workaround I've come up with is this:

1. Set the "url" in the config to `' '` (a single space). It can't be empty because too many things complain if it is. (See the setup-file sketch below.)
2. Patch the search plugin so that it saves an absolute URL rather than a relative one.
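For step 1, the relevant line in the setup file would look something like this (a sketch of an `ikiwiki.setup` excerpt; the wiki name is a placeholder):

    # ikiwiki.setup excerpt (sketch). The url is a single space:
    # it can't be empty, but we don't want a hostname baked in.
    use IkiWiki::Setup::Standard {
        wikiname => "MyWiki",  # placeholder
        url => " ",            # a single space
    }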
Here's a patch:
    diff --git a/IkiWiki/Plugin/search.pm b/IkiWiki/Plugin/search.pm
    index 3f0b7c9..26c4d46 100644
    --- a/IkiWiki/Plugin/search.pm
    +++ b/IkiWiki/Plugin/search.pm
    @@ -113,7 +113,7 @@ sub indexhtml (@) {
     	}
     	$sample=~s/\n/ /g;
     
    -	my $url=urlto($params{destpage}, "");
    +	my $url=urlto($params{destpage}, undef);
     	if (defined $pagestate{$params{page}}{meta}{permalink}) {
     		$url=$pagestate{$params{page}}{meta}{permalink}
     	}
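For reference, here's a sketch of what that one-line change does, based on how `IkiWiki::urlto()` behaves (the page name is a placeholder):

    use IkiWiki;  # exports urlto() and %config

    # With "" as the second argument, urlto() returns a URL relative
    # to the top of the wiki:
    my $rel = urlto("path/to/found/page", "");
    # With undef, it instead anchors the URL on the configured wiki url --
    # which, after step 1 above, is a single space, hence the
    # space-prefixed links mentioned below.
    my $abs = urlto("path/to/found/page", undef);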
It works for me, but it has the odd side-effect of prefixing links with a space. Fortunately that doesn't seem to break browsers.
And I'm sure someone else could come up with something better and more general.
--[[KathrynAndersen]]
> The `<base href>` is required to be genuinely absolute (HTML 4.01 §12.4).
> Have you tried setting `url` to the public-facing URL, i.e. with `alfred`
> as the hostname? That seems like the cleanest solution to me; if you're
> one of the few behind the firewall and you access the site via `betty`
> directly, my HTTP vs. HTTPS cleanup in recent versions should mean that
> you rarely get redirected to `alfred`, because most URLs are either
> relative or "local" (start with '/'). --[[smcv]]
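> In setup-file terms, that suggestion is just something like (hostname is a
> placeholder):
>
>     url => "http://alfred.example.com/",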


@@ -121,3 +121,15 @@ the user agent to be programmatically manipulated? --[[schmonz]]
>> Pong.. I'd be happier with a more 100% solution that let cookies be used
>> w/o needing to write a custom plugin to do it. --[[Joey]]
>>> According to LWP::UserAgent, for the common case, a complete
>>> and valid configuration for `$config{cookies}` would be `{ file =>
>>> "$ENV{HOME}/.cookies.txt" }`. In the more common case of not needing
>>> to prime one's cookies, `cookie_jar` can be `undef` (that's the
>>> default). In my less common case, the cookies are generated by
>>> visiting a couple of magic URLs, which would be trivial to turn into
>>> config options, except that these particular URLs rely on SPNEGO
>>> and so LWP::Authen::Negotiate has to be loaded. So I think adding
>>> `$config{cookies}` (and using it in the aggregate plugin) should
>>> be safe, might help people in typical cases, and won't prevent
>>> further enhancements for less typical cases. --[[schmonz]]
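>>> A sketch of how a plugin could consume such a setting (assumed wiring,
>>> not an actual patch; `$feedurl` is a placeholder):
>>>
>>>     use IkiWiki;        # exports %config
>>>     use LWP::UserAgent;
>>>
>>>     # LWP::UserAgent accepts a plain hashref for cookie_jar and builds
>>>     # an HTTP::Cookies jar from it, so a $config{cookies} of
>>>     # { file => "$ENV{HOME}/.cookies.txt" } works as-is, and an undef
>>>     # value simply leaves cookies disabled (the default).
>>>     my $ua = LWP::UserAgent->new(
>>>         cookie_jar => $config{cookies},
>>>     );
>>>     my $res = $ua->get($feedurl);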