
Required or facilitated reciprocity?

[Image: Peers support each other]

The following post is based on two [1] mails [2] I wrote as part of a discussion on the jox mailing list [3] (a relic of the short-lived [4] [DE] CSPP journal) at the end of March. I try to explain why I have changed my position compared to the suggestions formulated in the book From Exchange to Contributions [5], but also why the change is not as radical as some people seem to think.

While in my book I describe what could be characterized as “open sharing communities requiring reciprocity” (you are required to contribute in order to benefit), my more recent work [6] is about “open sharing communities facilitating reciprocity” – where contributing in some ways is easy and encouraged, but it is not required in order to benefit. When we look at existing successful peer communities, we see that they tend to follow the latter model, hence the change.

The old model was more concerned with lifting the Protestant work ethic [7] into a context of peers (as opposed to the buyer/seller relation of capitalism). Work is considered a necessary evil that people will only do if required (forced) to do so, hence contributions must be required or they won’t be done. But actually, successful peer production is more about extending and generalizing the hacker ethic [8] – turning work into something that is fun, pleasurable and rewarding in itself. If you manage to do this, requiring contributions or giving additional external rewards is no longer necessary and indeed often harmful (the crowding-out effect [9] described by Benkler and others).

Michel Bauwens [10] asked whether it isn’t more reasonable to have an “integrative approach” which goes from “one special case, the need for reciprocity, to another, that doesn’t require it.” But actually, I never said anything against reciprocity per se. Reciprocity exists in either case, both in my old approach of “requiring reciprocity” and in the newer approach of “facilitating reciprocity.” I do something for others, and others do something for me. Indeed, that’s true of any society.

Also, it’s noteworthy that in both models, reciprocity is indirect – I do something for the community (= other community members) and the community (= other community members) does something for me. But the community members that do something for me will generally not be those that I do something for. In my old “task auctioning” model this indirect reciprocity was enforced and measured – I had to give back (in general) the same amount of labor that was needed to produce the goods I consume.

Society-wide, that relation will always hold – only the goods that have been produced can be consumed. But I now think it is no longer necessary to enforce this on an individual level. If you stop considering consumption as “the good” (that everybody wants to increase as much as possible) and production as “the bad” (that everybody wants to avoid), instead considering both as necessary, interwoven, and potentially pleasant aspects of life (as the hacker ethic does), then enforcing something becomes much less important. After all, you wouldn’t force people to consume, so why force them to produce?

That still leaves the question of how to minimize possible mismatches between consumptive and productive desires. I think that stigmergy [11], automation and re-organization are the best responses here (a rough sketch of this loop follows the list):

  1. Announce the tasks necessary for your consumptive goals, and wait for volunteers.
  2. If there aren’t enough, try to automate the task, i.e. let machines do it. Getting there will usually require other tasks, so go back to step 1 for them. (I think it will often be easier to find volunteers to automate something than to do it manually.)
  3. If automation is not (or only partially) possible and there still aren’t enough volunteers, think about how to re-organize the task so that it becomes more attractive to potential volunteers. (Indeed, potential volunteers will do this themselves and might decide to re-organize tasks in ways you didn’t foresee.)
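
The three steps above read almost like an algorithm, so here is a minimal Python sketch of the loop, purely for illustration. All names in it (Task, find_volunteers, automation_subtasks, reorganize) and the “attractiveness” number are hypothetical stand-ins for decisions a real community would make for itself; the only point is the control flow: announce, automate, re-organize, and only then give up.

```python
# Minimal sketch of the announce / automate / re-organize loop (assumed names).

from dataclasses import dataclass


@dataclass
class Task:
    name: str
    volunteers_needed: int
    attractiveness: float = 1.0  # crude proxy for how pleasant the task is


def find_volunteers(task: Task) -> int:
    # Stand-in for stigmergic self-selection: in this toy model, more
    # attractive tasks simply draw more volunteers.
    return round(task.volunteers_needed * min(task.attractiveness, 1.0))


def automation_subtasks(task: Task) -> list[Task]:
    # Step 2: announce the work needed to let machines do the task
    # (often more attractive to volunteers than doing it manually).
    return [Task(f"automate: {task.name}", volunteers_needed=1, attractiveness=1.2)]


def reorganize(task: Task) -> Task:
    # Step 3: restructure the task so that it becomes more attractive.
    return Task(task.name, task.volunteers_needed, task.attractiveness + 0.3)


def allocate(task: Task, depth: int = 0, max_depth: int = 3) -> list[Task]:
    """Return the tasks that still lack volunteers after trying steps 1-3."""
    # Step 1: announce the task and wait for volunteers.
    if find_volunteers(task) >= task.volunteers_needed:
        return []

    if depth < max_depth:
        # Step 2: try to automate it; automation itself needs tasks,
        # so those go back to step 1.
        unfilled = [t for sub in automation_subtasks(task)
                    for t in allocate(sub, depth + 1, max_depth)]
        if not unfilled:
            return []  # once the machines are built, the task is covered

        # Step 3: automation didn't work out, so re-organize the task
        # and announce it again.
        return allocate(reorganize(task), depth + 1, max_depth)

    # Nothing helped: the task joins the pool of unpleasant leftovers
    # discussed in the next paragraph.
    return [task]


if __name__ == "__main__":
    leftover = allocate(Task("clean the workshop", volunteers_needed=3, attractiveness=0.4))
    print([t.name for t in leftover])  # empty if volunteers or automation covered it
```

In this toy run the unattractive cleaning task finds too few direct volunteers, but the single task of automating it does find one, so nothing lands in the leftover pool – which is exactly the remainder the next paragraph is about.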

After all this, a pool of (apparently quite unpleasant) tasks which nobody wants to do might remain. For these, I would first consider voluntary distribution among the community members, where (more or less) everybody does a small part of them now and then, without anything very bad happening if you don’t. (Though you could get some bad looks or nagging from other community members if the community strongly expects these tasks to be done and you refuse. Hence the line between voluntary and enforced can be quite blurred.) If that doesn’t work, i.e. if too many people opt out, the community would doubtless agree on more formal sanctions, such as restricting the consumptive options of those who refuse. In this case, the task pool would revert to my old model of required reciprocity.

I certainly won’t rule out that this can happen, so in that sense I haven’t “given up” my old model. I just don’t consider it the best, or even the most likely scenario. Let’s see how far we get with solutions that follow the hacker ethic, or “peer spirit” – stigmergic, self-organized, voluntary. We can figure the rest out later if and when needed.

[Continued by: Why not just pay someone when needed? [12]]