If you've found a bug in ikiwiki, post about it here. TODO items go elsewhere. Link items to done when done.
Also see the Debian bugs.
If I try to authenticate using openid on my site, ikiwiki tries to make an http or https connection to the openid server. This doesn't work, because the direct connection is blocked by the firewall.
It would be good if ikiwiki supported configuring a proxy server to solve this.
I have found that if I add:

    newenviron[i++]="HTTPS_PROXY=http://host.domain.com:3128";

to IkiWiki/Wrapper.pm it solves the problem for https requests; however, it would obviously be preferable if the proxy name did not have to be hard-coded there.
Also, the ability to set HTTPS_CA_FILE and HTTPS_CA_DIR might benefit some people. Then again, I can't see any evidence that the SSL certificate of the server is being checked. See the bug report I filed on this separate issue.
Unfortunately, HTTP_PROXY doesn't work for http requests; it looks like that library is different.
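A minimal sketch of how a setup option could feed this through, assuming a hypothetical openid_proxy option (not an existing ikiwiki setting) and that the HTTP clients involved honour the conventional proxy environment variables:

    # Hypothetical sketch only: "openid_proxy" is not a real ikiwiki option.
    # Export the configured proxy so libraries that honour the conventional
    # environment variables (e.g. HTTPS_PROXY for https requests) pick it up.
    if (defined $config{openid_proxy}) {
        $ENV{HTTPS_PROXY} = $config{openid_proxy};
        $ENV{http_proxy}  = $config{openid_proxy};
    }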
Brian May
Posted Sun Aug 30 21:49:15 2009

Going from RecentChanges, when viewing the diff of a newly created page (like http://tinyurl.com/35f5q3 for the "Allow web edit form comment field to be mandatory" page), the diff for the newly added page should, in my opinion, be shown.
Posted Sun Aug 30 21:49:15 2009

I don't see any way to make gitweb do that. You can click on the filename after the "diff -cc" to see the whole file output, but gitweb won't show a diff for a newly added file. --Joey
It looks like all links in the generated website are absolute paths; this has some limitations:
- If connecting to the website via https://..., all links will take you back to http://.
- It makes it harder to mirror the website via the HTML version, as all links have to be updated.
It would be good if relative paths could be used instead, so the transport method isn't changed unless specifically requested.
-- Brian May
Er, which absolute links are you talking about? If you view the source to this page, you'll find links such as "../favicon.ico", "../style.css", "../../", and "../". The only absolute links are to CGIs and the w3c DTD. --Joey
The problem is within the CGI script. The links within the HTML page are all absolute, including links to the css file. Having http links within an HTML page retrieved using https upsets most browsers (I think). Also, if I push cancel on the edit page in https, I end up at an http page. -- Brian May
Ikiwiki does not hardcode http links anywhere. If you don't want it to use such links, change your configuration to use https consistently. --Joey
Errr... That is not a solution, that is a workaround. ikiwiki does not hard-code the absolute paths, but absolute paths are hard-coded in the configuration file. If you want to serve your website so that the majority of users see it as http, including in rss feeds (this allows proxy caches to cache the contents and reduces load), but editing is done via https for increased security, it is not possible. I have some ideas how this could be implemented (as ikiwiki has the absolute path to the CGI script and the absolute path to the destination, it should be possible to generate a relative path from one to the other), although some minor issues still need to be resolved. -- Brian May
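For what it's worth, a minimal sketch of that relative-path idea using the URI module (the function name and example URLs are made up for illustration; this is not how ikiwiki currently builds links):

    # Sketch: derive a relative link from the page being rendered to the CGI,
    # so the transport (http vs https) the reader arrived with is preserved.
    use URI;

    sub relative_cgi_link {
        my ($page_url, $cgi_url) = @_;
        # URI::rel computes the path from $page_url to $cgi_url
        return URI->new($cgi_url)->rel(URI->new($page_url))->as_string;
    }

    # relative_cgi_link("http://example.com/wiki/foo/", "http://example.com/cgi-bin/ikiwiki.cgi")
    # returns "../../cgi-bin/ikiwiki.cgi"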
I noticed the links to the images on http://ikiwiki.info/recentchanges/ are also absolute, that is http://ikiwiki.info/wikiicons/diff.png; this seems surprising, as the change.tmpl file uses <TMPL_VAR BASEURL> which seems to do the right thing in page.tmpl, but not for change.tmpl. Where is BASEURL set? -- Brian May
Posted Sun Aug 30 21:49:15 2009

The use of an absolute baseurl in change.tmpl is a special case. --Joey
When you click on a broken link to create a new page, Ikiwiki lower-cases the new page's filename. I wish it wouldn't.
If I click on "Czars in Russia", I'd like Ikiwiki to create "Czars_in_Russia.mdwn", not "czars_in_russia.mdwn". Is this possible? --sabr
Posted Sun Aug 30 21:49:15 2009

There's a simple patch that can do this:
    --- a/IkiWiki.pm
    +++ b/IkiWiki.pm
    @@ -584,7 +584,7 @@ sub htmllink ($$$;@) { #{{{
     		return "<span class=\"createlink\"><a href=\"".
     			cgiurl(
     				do => "create",
    -				page => pagetitle(lc($link), 1),
    +				page => pagetitle($link, 1),
     				from => $lpage
     			).
     			"\">?</a>$linktext</span>"
This is fine if you don't mind mixed or randomly cased filenames getting created. Otoh, if the link happened to start a sentence and so had its first letter upper-cased, that might not be desired.
Of course ikiwiki is case insensitive, and there are other ways of creating pages that don't lower case them, including using the create-a-page form on a blog (as was done for this page..).
I'm undecided about making the above change by default though, or about making it a config option. Maybe it would be better to include both capitalisations in the select list that is used to pick the name for the newly created page. Then, which one is the default wouldn't much matter. (The non-lower cased one would probably be the best choice.) --Joey
Either of your proposed solutions (make it the default or include both in the pop-up menu) sounds fine to me. Which one is easier?
--sabr
Please have a look at http://www.bddebian.com/~wiki/hurd/running/debian/faq/. There is (on a sufficiently small display) a large gap between the vmstat line and the first Posted line, even without any local.css.

This is because of the clear: both in ikiwiki's style.css, line 109, for .pagedate, .pagelicense, .pagecopyright.

I can override this in local.css, but what was the original reason for adding this clear: both?
Posted Sun Aug 30 21:49:15 2009

Without investigating in detail, I think it was probably because any of the items can be enabled or disabled. So any of them might be the first item after the horizontal rule, and any might be the last item before the modification date. So all of them have to clear both above and below. I'm sure there are better ways for the CSS to handle that. --Joey
I'm using the tags plugin with tagbase="tags".
Already existing tags, corresponding to pages like tags/foo.html work just fine.
If I add to a page a tag which does not exist (e.g. with ), the just-modified page will have a link which points to tags/newtag. This is in theory correct, but in practice leads to creating a tags/newtag subpage of the page I'm editing, while my tagbase is supposed to be relative to the wiki root.
When used in a wiki which already has some tags, this leads to mixing up tags located in tags/ and tags located in whatever/tags/.
Posted Sun Aug 30 21:49:15 2009

When a new page is being edited, ikiwiki lets you choose where the page will be created, so you'll get a dropdown list of possible locations for the tag page. You should be able to choose between either tags/foo or page/tags/foo.
The way that ikiwiki decides which location to default to in this box is fairly involved; but in general it will default to creating the page at the same level as the tagged page. So if the tag is on any toplevel page in the wiki, it will default to creating tags/foo; if the tag is on a page in a subdirectory, it will default to creating subdir/tags/foo.

I personally like this behavior; it allows me to create a subdirectory for a particular thing and use tags that are specific to that thing, which are kept confined to that subdirectory by default. For example, this is used for ikiwiki's own plugins tags, which are all located under plugins/type/* and which apply to pages under plugins/*.
It's clearly not the right default for every situation though. Explicitly setting a tagbase probably lessens the likelihood that it's the right default for things under that tag base. I'd not be opposed to adding a special case to change the default in this case, or adding a configuration option to change the default globally. On the other hand, it is pretty simple to just check the location and select the right one from the list when creating a new page..
--Joey
The Atom and RSS templates use ESCAPE=HTML in the title elements. However, HTML-escaped characters aren't valid according to http://feedvalidator.org/.

Removing ESCAPE=HTML works fine, but I haven't checked to see if there are any characters it won't work for.

For Atom, at least, I believe adding type="xhtml" to the title element will work. I don't think there's an equivalent for RSS.
Posted Sun Aug 30 21:49:15 2009

Removing the ESCAPE=HTML will not work; the feed validator hates that just as much. It wants rss feeds to use a specific style of escaping that happens to work in some large percentage of all rss consumers (most of which are broken): http://www.rssboard.org/rss-profile#data-types-characterdata There's also no actual spec about how this should work.
This will be a total beast to fix. The current design is very clean in that all (well, nearly all) xml/html escaping is pushed back to the templates. This allows plugins to substitute fields in the templates without worrying about getting escaping right in the plugins -- and a plugin doesn't even know what kind of template is being filled out when it changes a field's value, so it can't do different types of escaping for different templates.
The only reasonable approach seems to be extending HTML::Template with an ESCAPE=RSS and using that. Unfortunately its design does not allow doing so without hacking its code in several places. I've contacted its author to see if he'd accept such a patch.
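For illustration, roughly the kind of escaping an ESCAPE=RSS would have to perform, as I read the rssboard profile linked above (a sketch, not an existing HTML::Template feature; the sub name is made up):

    # Sketch: escape character data the way the RSS profile suggests, using
    # hexadecimal references for & and < and leaving everything else alone.
    sub escape_rss {
        my $text = shift;
        $text =~ s/&/&#x26;/g;
        $text =~ s/</&#x3C;/g;
        return $text;
    }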
(A secondary bug is that using meta title currently results in unnecessary escaping of the title value before it reaches the template. This makes the escaping issues show up much more than they need to, since lots more characters are currently being double-escaped in the rss.)
--Joey
Update: Ok, I've fixed this for titles, as a special case, but the underlying problem remains for other fields in rss feeds (such as author), so I'm leaving this bug report open. --Joey
In markdown syntax, none of the other special characters get processed inside a code block. However, in ikiwiki, wiki links and preprocessor directives still get processed inside a code block, requiring additional escaping. For example, [links don't work](#here), but a [[wikilink]] becomes HTML. --JoshTriplett
Indented lines provide a good way to escape a block of text containing markdown syntax, but ikiwiki links like [[this]] are still interpreted within such a block. I think that interpretation should not be happening. That is, I should be able to write:

    [[this]]

and have it render like:

[[this]]
--cworth
Posted Sun Aug 30 21:49:15 2009

Has there been any progress or ideas on this bug recently? I use an expanded CamelCase regexp, and without much escaping in freelink text, url links, or code blocks, I get IkiWiki's attempt at creating a "link within a link".
I have no ideas other than perhaps once IkiWiki encounters [[ or the position is reset with a backreference from a CamelCased word, further processing of wikilinks is disabled until the position is reset and a "no not makelinks" flag or variable is cleared.
I've come up with some really ugly workarounds to handle case-specific stuff like codeblocks, but the problem creeps up again and again in unexpected places. I'd be happy to come up with a patch if anyone has a bright idea on a nice clean way (in theory) to fix this. I'm out of ideas.
--CharlesMauch
I've moved the above comment here because it seems to be talking about this bug, not the similar Smileys bug.
In the case of either bug, no, I don't have an idea of a solution yet. --Joey
I've now solved a similar bug involving the smiley plugin. The code used there should give some strong hints how to fix this bug, though I haven't tried to apply the method yet. --Joey
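For reference, as I understand it the smiley fix does its substitution on the htmlized output while skipping preformatted regions; a minimal sketch of that splitting step (my own illustration, not ikiwiki's actual code):

    # Split content on <pre>/<code> regions and only apply the substitution
    # callback to the chunks that fall outside them.
    sub subst_outside_code {
        my ($content, $subst) = @_;
        my $out = '';
        foreach my $chunk (split m{(<(?:pre|code)[^>]*>.*?</(?:pre|code)>)}is, $content) {
            $out .= $chunk =~ m{^<(?:pre|code)}i ? $chunk : $subst->($chunk);
        }
        return $out;
    }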
ikiwiki will generate html formatted error messages to the command line if --cgi is set, even if it's not yet running as a cgi
Posted Sun Aug 30 21:49:15 2009

Following a diff link from RecentChanges, the log message shown will not be the one of the actual commit, but the one of some previous commit, in most cases the one made directly before the current commit.
Posted Sun Aug 30 21:49:15 2009

Is there some way to make gitweb show a diff with the right message? I don't see one, except for diffs that show all changes in the commit, rather than only changes to a single file. This feels like a bug in gitweb. --Joey
(Note: feel free to say "not a bug" or offer a workaround rather than changing ikiwiki.)
As reported by a Windows user trying ikiwiki: because Windows doesn't support filenames with colons, he couldn't check out the ikiwiki svn repository. More generally, ikiwiki doesn't encode colons in filenames for wiki pages, but to support Windows users perhaps it should.
Windows does not support filenames containing any of these characters: / \ * : ? " < > |
Posted Sun Aug 30 21:49:15 2009

I take it this is a problem when checking out a wiki in windows, not when browsing to urls that have colons in them from windows? --Joey
Correct. You can't directly check out a wiki's repository from Windows if it includes filenames with those characters; you will get errors on those filenames.
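A sketch of how such encoding could look, reusing the __NN__ style escaping that ikiwiki's titlepage already applies to other characters (the function name is made up; this is an idea, not current behaviour):

    # Encode the characters Windows cannot store in filenames, except "/",
    # which is the wiki's own path separator. Hypothetical helper.
    sub win_safe_filename {
        my $name = shift;
        $name =~ s/([\\*:?"<>|])/"__".ord($1)."__"/eg;
        return $name;
    }

    # win_safe_filename("foo:bar") returns "foo__58__bar"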
(Moved from discussion)
I've just upgraded to ikiwiki 2.50 with the recentchanges plugin enabled, and figured out that I have to turn on rss in ikiwiki.setup in order to get the recentchanges feed. Now the feed shows up, but the links in the feed go to the change pages, e.g. recentchanges/change_1700.html. I can see a recentchanges directory created in the working copy, containing files like change_1700._change, but for some reason they are not getting htmlized and carried over. I can see in recentchanges.pm that it explicitly registers an htmlize hook for the _change type, but something isn't happening. I also see return if $type=~/^_/; in render() in Render.pm, so I guess the upshot is I'm not sure how this is supposed to work; is there a bug here or just something I overlooked that I need to turn on? --Chapman Flack
Posted Sun Aug 30 21:49:15 2009

It's a (minor) bug that recentchanges optimises away generating the change pages, but that the rss/atom feed still links to them. --Joey
Hmm, ok, what's the intended correct behavior? To really generate the change pages, or to change the links in the feed to point somewhere else that's not missing? If you can easily point me to the right neighborhood in the code I might work on a patch for this. It may be a (minor) bug in the grand scheme of things, but it does seem pretty goofy if you've just clicked an RSS link.
--Chap (p.s. should this be moved to bugs?)
The latter -- I think making the permalink point to "recentchanges#someid" will probably work. Probably first by addressing the todo about ability to force particular UUIDs on blog posts, and then by just using that new ability in the page. --Joey
Ah. The prerequisite todo looks like more than I'd like to take on. In the meantime, would it be very involved to change whatever bug now optimizes away the change pages, or to simply have all the links in the feed point to the recentchanges page itself, with no fragment id? Either would be a bit nicer than having broken links in the feed. --Chap
The IkiWiki::pagetitle function does not respect title changes via meta.title. It really should, so that links rendered with htmllink get the proper title in the link text.

--madduck
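A minimal sketch of the behaviour being asked for (the %meta_title hash and the helper name are stand-ins for wherever the meta plugin records titles; this is not existing ikiwiki code):

    # When no explicit link text is given, prefer a title set via meta over
    # the page name derived from the filename.
    our %meta_title;   # hypothetical: page => title recorded by the meta plugin

    sub link_text_for {
        my $page = shift;
        return exists $meta_title{$page}
            ? $meta_title{$page}
            : IkiWiki::pagetitle(IkiWiki::basename($page));
    }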
Posted Sun Aug 30 21:49:15 2009

[[debbug 123456]]
expands to "bug #123456", which is hyperlinked. Could you please drop the leading "bug", for two reasons?
First, #123456 is not a bug, it's a bug report. And second, #123456 suffices, doesn't it? By hardcoding the "bug" in there, you make it impossible to start a sentence with a bug number, e.g.:
There are problems with code. #123456 is a good example of...
instead of
There are problems with code. bug #123456 is a good example of...
Thanks, --madduck
Posted Sun Aug 30 21:49:15 2009

I'm experimenting with using Ikiwiki as a feed aggregator.
The Planet Ubuntu RSS 2.0 feed (http://planet.ubuntu.com/rss20.xml) as of today has someone whose name contains the character u-with-umlaut. In HTML 4.0, this is specified as the character entity uuml. Ikiwiki 2.47 running on Debian etch does not seem to understand that entity, and decides not to un-escape any markup in the feed. This makes the feed hard to read.
The following is the test input:
<rss version="2.0">
<channel>
<title>testfeed</title>
<link>http://example.com/</link>
<language>en</language>
<description>example</description>
<item>
<title>ü</title>
<guid>http://example.com</guid>
<link>http://example.com</link>
<description>foo</description>
<pubDate>Tue, 27 May 2008 22:42:42 +0000</pubDate>
</item>
</channel>
</rss>
When I feed this to ikiwiki, it complains: "processed ok at 2008-05-29 09:44:14 (invalid UTF-8 stripped from feed) (feed entities escaped)"
Note also that the test input contains only pure ASCII, no UTF-8 at all.
If I remove the ampersand in the title, ikiwiki has no problem. However, the entity is valid HTML, so it would be good for ikiwiki to understand it. At the minimum, stripping the offending entity but un-escaping the rest seems like a reasonable thing to do, unless that has security implications.
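For reference, the named entity in question is one HTML::Entities knows about, so a decode step along these lines handles it (a sketch, not the aggregate plugin's actual code):

    use HTML::Entities;

    my $title = decode_entities("&uuml;");   # yields the character ü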
Posted Sun Aug 30 21:49:15 2009

I tested on unstable, and ikiwiki handled that sample rss fine, generating a ü.html. --Joey

I confirm that it works with ikiwiki 2.50, at least partially. The HTML output is OK, but the aggregate plugin still reports this:
processed ok at 2008-07-01 21:24:29 (invalid UTF-8 stripped from feed) (feed entities escaped)
I hope that's just a minor blemish. --liw
I am using mercurial as RCS backend and ikiwiki 2.40.
It seems that, when adding a blog post, it is not immediately committed to the mercurial repo. I have a page with this directive:
[[!inline pages="journal/blog2008/* and !*/Discussion" show="0" feeds="no" actions="yes" rootpage="journal/blog2008"]]
When I add a blog post, I see it on the wiki but it doesn't appear on History or RecentChanges. If I run hg status on the wiki source dir, I see the new file has been marked as A (i.e., a new file that has not been committed).

If I then edit the blog post, the file gets committed and I can see the edit on History and RecentChanges. The creation of the file remains unrecorded. --buo
Posted Sun Aug 30 21:49:15 2009

Ikiwiki calls rcs_add() if the page is new, followed by rcs_commit(). For mercurial, these run respectively hg add and hg commit. If the add or commit fails, it will print a warning to stderr; you might check apache's error.log to see if there's anything there. --Joey

The problem was using accented characters (é, í) in the change comments. I didn't have a UTF-8 locale enabled in my setup file. By coincidence this happened for the first time in a couple of consecutive blog posts, so I was mistaken about the root of the problem. I don't know if you will consider this behavior a bug, since it's strictly speaking a misconfiguration, but it still causes ikiwiki's mercurial backend to fail. A quick note in the docs might be a good idea. For my part, please close this bug, and thanks for the help. --buo
So, in a non-utf8 locale, mercurial fails to commit if the commit message contains utf8? --Joey
(Sorry for the delay, I was AFK for a while.) What I am seeing is this: in a non-utf8 locale, using mercurial "stand-alone" (no ikiwiki involved), mercurial fails to commit if the commit message has characters such as á. If the locale is utf8, mercurial works fine (this is with mercurial 1.0).
However, the part that seems a bit wrong to me, is this: even if my locale is utf8, I have to explicitly set a utf8 locale in the wiki's setup file, or the commit fails. It looks like ikiwiki is not using this machine's default locale, which is utf8. Also, I'm not getting any errors on apache's error log.
Wouldn't it make sense to use the machine's default locale if 'locale' is commented out in the setup file?
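A sketch of that suggested fallback, in terms of ikiwiki's %config setup hash (an idea, not current ikiwiki behaviour): only override the locale when one is explicitly set in the setup file, and otherwise leave the environment's locale alone.

    use POSIX qw(setlocale LC_ALL);

    if (defined $config{locale}) {
        setlocale(LC_ALL, $config{locale});
    }
    # else: keep whatever LANG/LC_ALL the machine's environment provides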
Please see http://vcs-pkg.org/planet/. The headers of posts link to the HTML pages, which ikiwiki scraped. Also, the [[meta]] titles and author directives aren't processed, but included inline. I believe that the headers should link to the posts directly, not the "cached" copies ikiwiki keeps around.
What's also not ideal is that the cached copies can be edited. Any edits there will never make it to the VCS and thus won't show up in recentchanges.
Posted Sun Aug 30 21:49:15 2009

It would be nice if the aggregate plugin would try to extract the m/ctime out of each post and touch the files on the filesystem appropriately, so that ikiwiki reflects the actual time of the post via the inline plugin, rather than the time when the aggregation ran to pull the post in. --madduck
Like this? (Existing code in aggregate.pm...) --Joey
# Set the mtime, this lets the build process get the right creation
# time on record for the new page.
utime $mtime, $mtime, pagefile($guid->{page})
if defined $mtime && $mtime <= time;
Posted Sun Aug 30 21:49:15 2009

I'll have to debug this, it's not working here... and this is an ikiwiki aggregator scraping another ikiwiki site.
A lot of strings in ikiwiki are hardcoded and not looked up in locale resources through gettext. This is bad because it makes ikiwiki difficult to adopt for non-English users.
I mean that, for instance in CGI.pm, a line like:
my @buttons=("Save Page", "Preview", "Cancel");
should be written as
my @buttons=(gettext("Save Page"), gettext("Preview"), gettext("Cancel"));
Yes, these need to be fixed. But note that the localised texts come back into ikiwiki and are used in various places, including plugins. Including, possibly, third-party plugins. So localising the buttons would seem to require converting from the translations back into the C locale when the form is posted. --Joey
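To illustrate the round trip Joey describes, a sketch against the code shown above (gettext and $form are assumed from CGI.pm's context; nothing here is existing ikiwiki code):

    # Translated button labels have to be mapped back to the C-locale strings
    # that ikiwiki and third-party plugins compare against.
    my @buttons = ("Save Page", "Preview", "Cancel");
    my %untranslate = map { gettext($_) => $_ } @buttons;

    # ...later, when the form has been submitted:
    my $pressed = $untranslate{$form->submitted} || $form->submitted;
    if ($pressed eq "Save Page") {
        # save the page, as the existing code does
    }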
Wouldn't it be easier to change all the calls to the correct ones (including in plugins)? For instance in the same file (CGI.pm):

    elsif ($form->submitted eq gettext("Save Page")) {

That way no conversion to the C locale is needed. gettext use should just be publicized in the documentation (at least in write). --bbb

It would be easy, but it could break third-party plugins that hardcode the english strings. It's also probably less efficient to run gettext over and over. --Joey
In the standard templates some things seem wrongly written too. For instance in page.tmpl, a line like:
<li><a href="<TMPL_VAR EDITURL>" rel="nofollow">Edit</a></li>
should be written as
<li><a href="<TMPL_VAR EDITURL>" rel="nofollow"><TMPL_VAR EDITURL_TEXT></a></li>
with EDITURL_TEXT variable initialized in Render.pm through a gettext call.
Am I wrong ?
No, that's not a sane way to localise the templates. The templates can be translated by making a copy and modifying it, or by using a tool to generate .mo files from the templates, and generate translated templates from .po files. (See l10n for one attempt.) But pushing the localisation of random strings in the templates through the ikiwiki program defeats the purpose of having templates at all. --Joey
If not I can spend some time preparing patches for such corrections if it can help.
-- bbb
Posted Sun Aug 30 21:49:15 2009

If a file in the srcdir is removed, exposing a file in the underlaydir, ikiwiki will not notice the change and rebuild it until the file in the underlaydir gets a mtime newer than the mtime the removed file had.
Relatedly, if there are two files with different extensions that build a page with the same name, in a directory, ikiwiki will update the page whenever either changes, using the changed one as the source. But if that most recently changed one is removed, it won't rebuild the page using the older one as the source.
Posted Sun Aug 30 21:49:15 2009

lockedit adds the form fields for a pagespec to preferences. This pagespec should be supplied "raw"; i.e., without quotes around it. Inexperienced users (such as myself) may provide an invalid pagespec, such as one with quotes on it. This will be merrily accepted by the form, but will cause no locking to take place.
Perhaps some validation should be performed on the pagespec and the form-submission return include "warning: this pagespec is invalid" or "warning: this pagespec does not match any existing pages" or similar.
Posted Sun Aug 30 21:49:15 2009

The brokenlinks plugin falsely complains that formatting has a broken link to smileys, if the smiley plugin is disabled. While the page links to it inside a conditional, and so doesn't show the link in this case, ikiwiki scans for links without looking at conditionals and so still thinks the page contains the link.
Posted Sun Aug 30 18:59:20 2009

After installing IkiWiki 2.16 on Mac OS X 10.4 server I attempted to use "/Library/Application\ Support/IkiWiki/Working\ Copies" as the parent of my $SRCPATH and get "skipping bad filename" errors for any .mdwn file in that directory:
skipping bad filename /Library/Application Support/IkiWiki/Working Copies/ikiwikinewt/index.mdwn
The .ikiwiki directory is correctly created in that directory. I switched to using a path with no spaces and it works correctly.
Posted Sun Aug 30 18:59:20 2009

I have set

    templatedir => "/srv/apache2/madduck.net/templates",

in ikiwiki.setup and put a custom page.tmpl in there, then called ikiwiki --setup and verified that it works. It even works when I push to the Git repo and let the receive-hook update the wiki.

However, when I make a change via the CGI (which has been created by the last setup run), it applies the default page.tmpl file to all pages it updates.
Posted Sun Aug 30 18:59:20 2009

This issue can arise in at least two ways:
- A permissions problem with the templatedir that prevents ikiwiki from accessing it. If it can't access it, it silently falls back to using templates from the default directory.
- A templatedir that doesn't have an absolute path. In this case ikiwiki will look relative to somewhere, which will sometimes work and sometimes not. Clearly not a good idea.
So far those are the only ways I can see that this could happen. It would be possible to make ikiwiki try to detect these sorts of problems; it could check if the templatedir exists, and check that it's readable. This would add some extra system calls to every ikiwiki run, and I'm not convinced it's worth it. --Joey
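The check being described might look roughly like this, in terms of ikiwiki's error() and %config (a sketch only; it is not implemented):

    # Warn early about a templatedir that can't actually be used, instead of
    # silently falling back to the default templates.
    if (defined $config{templatedir}) {
        error("templatedir should be an absolute path")
            unless $config{templatedir} =~ m{^/};
        error("templatedir $config{templatedir} is not a readable directory")
            unless -d $config{templatedir} && -r _;
    }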
If you create a foo.rst containing only a number, such as "11", rendering results in the following error being thrown. (Now that I've fixed the error throwing code..):
exceptions.TypeError:coercing to Unicode: need string or buffer, int found
--Joey
Does this patch against proxy.py help?
index 5136b3c..545e226 100755
--- a/plugins/proxy.py
+++ b/plugins/proxy.py
@@ -88,7 +101,7 @@ class _IkiWikiExtPluginXMLRPCHandler(object):
@staticmethod
def _write(out_fd, data):
- out_fd.write(data)
+ out_fd.write(str(data))
out_fd.flush()
@staticmethod
Posted Sun Aug 30 18:59:20 2009

No, still the same failure. I think it's failing parsing the input data (which perl probably transmitted as an int due to perl internals), not writing out its response. --Joey
Pages under templates/ are invalid (in fact, not only invalid, but also not well-formed) xhtml pages.
This problem is especially serious when you change the extension from .html to .xhtml in ikiwiki.setup and use Firefox, since Firefox displays an error message only for not-well-formed application/xhtml+xml pages.
It seems that HTML::Template also supports an <!-- TMPL_VAR --> comment syntax instead of the bare <TMPL_VAR> syntax. Changing to this syntax would solve the problem, I guess.

Even if changed to <!-- TMPL_VAR --> style, the problem may still exist if the template contains an if/else block.

Maybe just encoding all < and > when compiling pages within the templates folder would solve this problem.
Posted Sun Aug 30 18:59:20 2009

I never noticed this bug, since it only happens if the htmlscrubber is disabled. --Joey
If sandbox/page.mdwn has been generated and sandbox/sidebar.mdwn is created, the sidebar is only added to sandbox and none of the subpages. --TaylorKillian
Yes, a known bug. As noted in the code: --Joey
# FIXME: This isn't quite right; it won't take into account
# adding a new sidebar page. So adding such a page
# currently requires a wiki rebuild.
add_depends($page, $sidebar_page);
Posted Sun Aug 30 18:59:20 2009
Just saw a bug with the git backend and utf8. I was committing to ikiwiki and the post-commit hook emitted a warning message about some text from git not being encoded as proper utf-8. I've lost the message, but it was from line 36 of git.pm. After a couple other commits, the message stopped appearing.
Probably git's output needs to be force encoded to utf-8. --Joey
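The forced encoding Joey mentions could be as simple as the following (a sketch of the idea; where exactly it would go in git.pm, and the variable names, are assumptions):

    use Encode;

    # Sketch: treat whatever bytes git emits as utf-8; invalid sequences are
    # replaced rather than passed through. $raw_line stands in for a line
    # read from the git command's output.
    my $raw_line = "";
    my $line = Encode::decode_utf8($raw_line);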
Posted Sun Aug 30 18:59:20 2009

This bug is described here:
http://kitenet.net/~joey/blog/entry/OpenID/discussion/
Posted Sun Aug 30 18:59:20 2009

The header of a subpage always links to its "superpage", even if it doesn't exist. I'm not sure if this is a feature or a bug, but I would certainly prefer that superpages weren't mandatory.
For example, if you are in 'example/page.html', the header will be something like 'wiki / example / page'. Now, if 'example.html' doesn't exist, you'll have a dead link for every subpage.
This is a bug, but fixing it is very tricky. Consider what would happen if example.mdwn were created: example/page.html and the rest of example/* would need to be updated to change the parentlink from a bare word to a link to the new page. Now if example.mdwn were removed again, they'd need to be updated again. So example/* depends on example. But it's even more tricky, because if example.mdwn is modified, we don't want to rebuild example/*!
ikiwiki doesn't have a way to represent this dependency and can't get one without a lot of new complex code being added.
For now the best thing to do is to make sure that you always create example if you create example/foo. Which is probably a good idea anyway..
Note that this bug does not exist if the wiki is built with the "usedirs" option, since in that case, the parent link will link to a subdirectory, that will just be missing the index.html file, but still nicely usable.
Posted Sun Aug 30 18:59:20 2009

Has bugs updating things if the bestlink of a page changes due to adding/removing a page. For example, if Foo/Bar links to "Baz", which is Foo/Baz, and Foo/Bar/Baz gets added, it will update the links in Foo/Bar to point to it, but will forget to update the linkbacks in Foo/Baz.
And if Foo/Bar/Baz is then removed, it forgets to update Foo/Bar to link back to Foo/Baz.
As of 1.33, this is still true. The buggy code is the %linkchanged calculation in refresh(), which doesn't detect that the link has changed in this case.
Still true in 1.43 although the code is much different now..
Posted Sun Aug 30 18:59:20 2009

Web browsers don't word-wrap lines in submitted text, which makes editing a page that someone wrote in a web browser annoying (gqip is the vim user's friend here). Is there any way to improve this?

Posted Sun Aug 30 18:59:20 2009

See "using the web interface with a real text editor" on the tips page. --JoshTriplett
Would it be useful to allow a "max width" plugin, which would force the splitting of long lines on commit?
Please, no. That would wreak havoc on code blocks and arguments to preprocessor directives, and it would make bulleted lists and quoted blocks look bogus (because the subsequent lines would not match), among other problems. On the other hand, if you want to propose a piece of client-side JavaScript that looks at the active selection in a text area and word-wraps it, and have a plugin that adds a "Word-Wrap Selection" button to the editor, that seems fine. --JoshTriplett
I have perl 5.10.0. Ikiwiki 2.44 compiles fine. Compiling 2.45 fails after 'make':
perl -Iblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh
refreshing wiki..
docwiki.setup: Failed to load plugin IkiWiki::Plugin::goodstuff: Failed to load plugin IkiWiki::Plugin::shortcut: Too many arguments for IkiWiki::srcfile at IkiWiki/Plugin/shortcut.pm line 16, near "1)"
Compilation failed in require at (eval 31) line 2.
BEGIN failed--compilation aborted at (eval 31) line 2.
BEGIN failed--compilation aborted at (eval 23) line 2.
BEGIN failed--compilation aborted at (eval 10) line 21.
make: *** [extra_build] Error 255
I can't reproduce this. It looks like your IkiWiki.pm is out of sync with your IkiWiki/Plugin/shortcut.pm. The ones distributed in 2.45 are in sync. Or your perl is failing to use the right version of Ikiwiki.pm, perhaps using a previously installed version. But the -Iblib/lib instructs perl to look in that directory first, and the Makefile puts Ikiwiki.pm there. --Joey
I removed all traces of the previous installation, and now 2.45 compiles. I don't know why it was picking up the old version of Ikiwiki.pm, but now it works. Please close this bug, and thanks for the help.
Where were the files from the old installation? I still don't understand why they would be seen, since -Iblib/lib is passed to perl. --Joey
They were under /usr/local/{bin,lib,share}. I can try to provide more info, or try to reproduce it, if you need me to.
Well, here are some things to try.
perl -Iblib/lib -V
This should have blib/lib first in the listed @INC
joey@kodama:~/src/ikiwiki>strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
stat64("blib/lib/IkiWiki.pmc", 0xbfa1594c) = -1 ENOENT (No such file or directory)
stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31982, ...}) = 0
open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 5
This is how perl finds IkiWiki.pm here. Note that I've run "make" first.
OK, this is what I'm getting:
$ perl -Iblib/lib -V
@INC:
blib/lib
/usr/lib/perl5/site_perl/5.10.0
/usr/share/perl5/site_perl/5.10.0
/usr/lib/perl5/vendor_perl
/usr/share/perl5/vendor_perl
/usr/share/perl5/vendor_perl
/usr/lib/perl5/core_perl
/usr/share/perl5/core_perl
/usr/lib/perl5/current
/usr/lib/perl5/site_perl/current
I ran the following in my current 2.45 source dir, where the make already succeeded. If you need it, I can post the output for the case where make fails.
$ strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
stat64("blib/lib/IkiWiki.pmc", 0xbfa6167c) = -1 ENOENT (No such file or directory)
stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31901, ...}) = 0
open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 3
I need to see it in the case where it's failing. --Joey
I finally had some time to look into this again.
I wiped ikiwiki off my system, and then installed version 2.41. I tried installing 2.46 and get the same error as above, so I'll be using 2.46 below. (BTW, the debian page still lists 2.45 as current; I had to fiddle with the download link to get 2.46).
After running ./Makefile.PL I get:
$ perl -Iblib/lib -V
[bunch of lines snipped]
@INC:
blib/lib
[bunch of paths snipped]
Running the strace:
$ strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
I get a bunch of ENOENTs and then at the end:
stat64("./IkiWiki.pmc", 0xbfa2fe5c) = -1 ENOENT (No such file or directory)
stat64("./IkiWiki.pm", {st_mode=S_IFREG|0644, st_size=31987, ...}) = 0
open("./IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 3
After running make (and having it fail as described above):
$ strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
stat64("blib/lib/IkiWiki.pmc", 0xbfd7999c) = -1 ENOENT (No such file or directory)
stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31901, ...}) = 0
open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 3
I don't know what is going on, but I'll run any more tests you need me to.
No help. The only further thing I can think to try is strace -f the entire failing make run (or the ikiwiki command that's failing in it, if you can reproduce the failure at the command line). --Joey
I have 2.46 installed and I can reproduce the bug reported against 2.49. The command that fails is:
$ /usr/bin/perl -Iblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh
docwiki.setup: Failed to load plugin IkiWiki::Plugin::inline: Too many arguments for IkiWiki::htmlize at IkiWiki/Plugin/inline.pm line 359, near "))"
Compilation failed in require at (eval 14) line 2.
BEGIN failed--compilation aborted at (eval 14) line 2.
BEGIN failed--compilation aborted at (eval 10) line 21.
strace -f produces a 112K file. I don't know enough to be comfortable analyzing it. However, lines like:
stat64("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31982, ...}) = 0
make me think the make process is not completely independent of a previous installation. Joey, should I email you the strace log file?
Email it (joey@ikiwiki.info), or post it to a website somewhere. --Joey
The relevant part of the file is:
execve("/usr/bin/perl", ["/usr/bin/perl", "-Iblib/lib", "ikiwiki.out", "-libdir", ".", "-setup", "docwiki.setup", "-refresh"], [/* 55 vars */]) = 0
[...]
stat64("blib/lib/5.10.0/i686-linux-thread-multi", 0xbfa72240) = -1 ENOENT (No such file or directory)
stat64("blib/lib/5.10.0", 0xbfa72240) = -1 ENOENT (No such file or directory)
stat64("blib/lib/i686-linux-thread-multi", 0xbfa72240) = -1 ENOENT (No such file or directory)
[...]
stat64("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pmc", 0xbfa71e5c) = -1 ENOENT (No such file or directory)
stat64("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31982, ...}) = 0
open("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 4
So it doesn't look for IkiWiki.pm in blib at all. But it clearly has been asked to look in blib, since it looks for the 3 directories in it. When I run the same thing locally, I get:
execve("/usr/bin/perl", ["/usr/bin/perl", "-Iblib/lib", "ikiwiki.out", "-libdir", ".", "-setup", "docwiki.setup", "-refresh"], [/* 55 vars */]) = 0
[...]
stat64("blib/lib/5.10.0/i486-linux-gnu-thread-multi", 0xbf84f320) = -1 ENOENT (No such file or directory)
stat64("blib/lib/5.10.0", 0xbf84f320) = -1 ENOENT (No such file or directory)
stat64("blib/lib/i486-linux-gnu-thread-multi", 0xbf84f320) = -1 ENOENT (No such file or directory)
[...]
stat64("blib/lib/IkiWiki.pmc", 0xbf84ef4c) = -1 ENOENT (No such file or directory)
stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=32204, ...}) = 0
open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 6
The thing I really don't understand is why, on the system where perl fails to look in blib when straced as above, we've already established it does look for it when perl -Iblib/lib -e 'use IkiWiki' is straced.

The only differences between the two calls to perl seem to be:

- One runs perl, and the other /usr/bin/perl -- are these really the same program? Does perl -Iblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh fail the same way as the /usr/bin/perl variant?
- The -libdir ., which causes ikiwiki to modify @INC, adding "." to the front of it.

I'm entirely at a loss as to why I cannot reproduce this with the same versions of perl and ikiwiki as the two people who reported it. There must be something unusual about your systems that we have not figured out yet. --Joey
Joey, thanks for your time and effort looking into this.
I checked with which: perl is indeed /usr/bin/perl. The commands fail similarly when calling perl and /usr/bin/perl.
However, you might be onto something with your libdir idea. If I remove it from the command line, the command succeeds. In other words, if I run
perl -Iblib/lib ikiwiki.out -setup docwiki.setup -refresh
then it works perfectly.
Well, that's just weird, because libdir is handled by code in IkiWiki.pm. So I don't see how setting it could affect its searching for IkiWiki.pm at all, actually. It could only affect its searching for files loaded later. Anyway, can I get a strace of it succeeding this way?

Also, can you show me the first 15 lines of your ikiwiki.out? It's occurred to me you might have an unusual use lib line in it.
By the way, I'm running Arch linux. The perl build script is a bit long, but I see they install a patch to modify @INC: http://repos.archlinux.org/viewvc.cgi/perl/repos/core-i686/perl-5.10.0-archlinux-inc-order.patch?revision=1&view=markup
Would you suggest I try rebuilding perl without this patch? Debian has a huge perl patch (102K!); it's not straightforward for me to see if they do something similar to Arch.
Posted Sun Aug 30 18:59:20 2009

I think Debian has a similar patch.
RSS output contains relative links. Ie. http://kitenet.net/~joey/blog/index.rss contains a link to http://kitenet.net/~joey/blog/../blog.html
Posted Sat Aug 29 03:45:22 2009