The biggest advantage of a wiki is the ease of linking between documents.
The second biggest (and distinguishing) advantage of a wiki is the ability for everyone to edit.
The first advantage, linking, is not the sole domain of wikis, but many people (myself included) use wikis specifically for this purpose. The ability to create (and track) documents through links is the principal power of the web as a whole, yet I can think of few applications outside of wikis that make it easy. I maintain that linking pages is the principal feature of a CMS too, and there it is probably the most poorly implemented feature, as well as the biggest source of lock-in.
So what makes up this feature? I think it consists of two main parts: the ability to assume part of a URL, and the ability to tie the page title to the part of the URL that does need to be specified.
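As a sketch of those two parts, assuming a simple title-to-slug convention (the names `slugify` and `page_url` are hypothetical, not any particular wiki's API): the wiki assumes the base of the URL, and derives the rest from the page title.

```python
import re

def slugify(title):
    """Turn a page title into the URL fragment the wiki derives from it:
    lowercase, punctuation stripped, spaces hyphenated."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)
    slug = re.sub(r"\s+", "-", slug)
    return slug.strip("-")

def page_url(base, title):
    """The wiki 'assumes' the base; the title supplies the rest."""
    return base.rstrip("/") + "/" + slugify(title)

print(page_url("https://example.org/wiki", "Bug Tracking: Mantis"))
# prints https://example.org/wiki/bug-tracking-mantis
```

The point is that an author only ever types the title; the link target follows from it automatically.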
One important feature of a wiki for me is the ability to build a hierarchy of pages, such as tools:testing:bug-tracking:mantis (or tools\testing\bug-tracking\mantis). This is namespacing in code, and in wikis it is valuable too. Using composition, you can also include an element in a different hierarchy (namespace), such as projects:open-source:php:mantis. It is a powerful organizational feature.
A CMS with a good sitemap-generation tool would have the same advantage, particularly if pages could then be composed of components from different hierarchies. By sitemap generation, I mean the ability to create hierarchies of pages as objects that can interact (through links) and be reached by aliases or alternate routes.
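A minimal sketch of that idea, assuming a flat map of colon-separated paths to page objects (`Wiki`, `Page`, `place`, and `resolve` are made-up names for illustration): the same page object is placed in two hierarchies, so either path resolves to it.

```python
class Page:
    """A wiki page; identity matters more than content here."""
    def __init__(self, title):
        self.title = title

class Wiki:
    def __init__(self):
        self.routes = {}  # colon-separated namespace path -> Page

    def place(self, path, page):
        # Composition: one Page object may live under several paths.
        self.routes[path] = page

    def resolve(self, path):
        return self.routes.get(path)

wiki = Wiki()
mantis = Page("Mantis")
wiki.place("tools:testing:bug-tracking:mantis", mantis)
wiki.place("projects:open-source:php:mantis", mantis)

# Both hierarchies reach the very same page object.
assert wiki.resolve("tools:testing:bug-tracking:mantis") is \
       wiki.resolve("projects:open-source:php:mantis")
```

The aliasing falls out for free: an "alternate route" is just a second key pointing at the same object, rather than a copy of the page.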
I mention routes because I think of Rails’ “routes” mechanism for mapping pages (or servlet-mapping, for Java aficionados). The other feature of wiki URLs, and of CMSes and Rails, that people are drawn to is pretty URLs, or rather, descriptive URLs.
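In the spirit of such route mapping, a descriptive URL can be translated mechanically into a namespace path. This is a hypothetical convention, not Rails itself; `route_to_path` is a made-up helper that assumes slashes in the URL correspond to colons in the wiki hierarchy.

```python
def route_to_path(url_path):
    """Map a descriptive URL path like /tools/testing/bug-tracking/mantis
    to a colon-separated wiki page path."""
    return ":".join(seg for seg in url_path.strip("/").split("/") if seg)

print(route_to_path("/tools/testing/bug-tracking/mantis"))
# prints tools:testing:bug-tracking:mantis
```

With a rule like this, the pretty URL is not decoration; it is the page's address in the hierarchy, so the two stay in sync by construction.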
The other feature of wikis that people like, as I mentioned, is the ability for everyone to edit. It’s also the most fraught. Wikipedia is the hallmark of success for this feature, but a large part of its effort goes into fighting corruption, spam, and bias. That is the reason something else hasn’t taken its place. For every Wikipedia, there are a million overrun, neglected, or out-of-date wikis (like my own).
Permissions, captchas, and administrators are the answer to this, but most wikis die of organizational failure. The ability to edit pages (and map links), and the ability to structure them in hierarchies, are critical to the success of a wiki.
The choice to restrict access is a critical one, and paradoxically, it matters most (and can do the most harm) before a wiki has drawn a user base large enough to be self-sustaining.
The question, then, is how you build up the user base of a wiki to a self-sustaining level without it getting overrun or disorganized. Better spam prevention, approval workflows, and organizational editing may be the answer.
I’d be curious to know what that point is. I’d guess around 100 dedicated or 1,000 casual users. In a restricted environment (such as an intranet or members-only wiki) the number might be as low as 10 (or perhaps it is better expressed as a ratio, say 1:3 dedicated to casual users), with organization and relevance being the battles that need fighting, rather than spam. The ratio may be the key, with spammers counting as some number of casual users. Clearly, the administrative tools affect (and should be aimed at lowering) this ratio.