Comments:


CoCreatr     Thu, Jul 15, 2010  Permanent link
Not really my definition, but one that fits, albeit from a completely different context.
(Source link below quote.)

... I think we can surmise that the campfire helped us speak and so it helped us become conscious. Something like this happened about 100,000 – 60,000 years ago. For suddenly our tool development, art and technology took off. All the foundations of our world today were discovered in a 10,000-year period. Tools had been the same for a million years. Within 1,000 years they were completely different. We invented pottery. We invented metallurgy. The wheel. Everything we depend on was discovered then. Not only discovered but widely disseminated in a short period of time.

How did this occur?

My bet is that it happened because of the social process created by the campfire and by our hunter-gatherer culture of equality. Such an environment extracts order from chaos. Design from intuition. It is ideal for the exploration of implicit knowledge. It is ideal for discovering things that we don’t know exist. It is ideal for taking half-baked ideas and refining them. Let’s use a thought experiment. ...


Source: Have books been bad for us? by Rob Paterson
shiftctrlesc     Thu, Jul 15, 2010  Permanent link
Thanks CoCreatr ... after days of swimming in facts, I'm very grateful to be reminded that metaphors are often more valuable than definitions.
notthisbody     Fri, Jul 16, 2010  Permanent link
@vincent_olivier's answers, from twitter:

Any workable P2P system RELIES on "superpeers". That is, absolute P2P is impossible in both theory AND practice. Eat that you CI terrorist.

Ergo, real P2P as a political, economical or social framework is a dystopic mathematical fallacy. QED.

(beat)

P2P to me is the maximum effort to avoid hardcoding stratification mechanisms in social architecture/infrastructure.


Perhaps these are a couple of different definitions. I leave them open for interpretation and lack of context :)
BenRayfield     Sat, Jul 24, 2010  Permanent link
The peer/superpeer software designs that get called peer-to-peer exist to work around a kind of internet censoring that was created by accident but has persisted because it works against peer-to-peer software. It started because we were running out of IPv4 addresses, yet most of the internet still uses IPv4 (instead of upgrading to IPv6, which has been available for 10 years) because governments and internet service providers (ISPs) like it that way. For 75 dollars a month I'm renting a computer, and internet service for it, that is not censored in that way, but most people are not going to do that. The difference between a "server" and a "normal computer" is the same as the difference between a "superpeer" and a "peer". I have internet service, but I have to buy extra service to not be censored.

This censoring prevents peer-to-peer software from working unless it uses a peer/superpeer design, where a "peer" is not really a peer; a "superpeer" is what we normally mean by peer. The only difference between a peer and a superpeer is whether the internet service is censored by Network Address Translation (NAT) or not. The problem is getting worse. I said many months ago that this was obviously being done to centralize control of the internet, and later I heard that the president of the USA has an off switch for the internet in the USA. There are a few cases, like a smarter-than-human virus infecting the internet, where such an off switch would be good to have, but I'm more worried that politics would influence, threaten, or force the off switch to be used for things that serve the government instead of the people. The biggest problem I have with such an "off switch" is that it has been planned in secret for years, in an undemocratic way, because they knew we would react badly to it, so they decided to do it little by little without telling us.

Those are the main issues involving peer-to-peer software. Anyone can pay extra money to be a "superpeer", and a superpeer is the only kind of internet service that can really do peer-to-peer. Normal peers can only connect to superpeers, not to other peers, so they're not really peer-to-peer: for "peers" to communicate with other "peers" in any way, the traffic has to go through one or more "superpeers", which contradicts the idea of them being peer-to-peer. Superpeers have no such restriction and can communicate with any other superpeer that chooses to listen. Peers have only one option: connect out to a superpeer. A superpeer cannot contact them; the first communication has to come from the peer, because peers cannot receive connections, only start them. Superpeer-to-superpeer is the only real p2p, and it threatens the authority of ISPs, so they continue the subtle hardware designs that indirectly cause this censoring.

Please do something about the censoring/hacking called http://wikipedia.org/wiki/Network_address_translation so every computer can have the communication abilities of a server/superpeer. Would you accept the censoring of certain words or ideas if you could pay 75 dollars a month extra for internet service that was not censored? If not, why should we accept this anti-peer-to-peer censoring, which costs extra to avoid? Why should we accept any censoring at any price? How many people even know it's censored at all? Most people think "this p2p program is broken" instead of "this p2p program is censored by my ISP".
That's why people accept censoring: most of them do not know it exists.
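
To make the connectivity asymmetry concrete, here is a minimal sketch in Python sockets. The hostname, port, and the echo-style "relay" are hypothetical stand-ins; the only point it illustrates is that a superpeer can accept inbound connections like a server, while a peer behind NAT can only open outbound connections and must speak first.

    import socket

    SUPERPEER_HOST = "superpeer.example.org"  # hypothetical publicly reachable host
    SUPERPEER_PORT = 7000                     # hypothetical port

    def run_superpeer():
        """A superpeer is like a server: it can accept inbound connections,
        so other superpeers and NATed peers can reach it directly."""
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind(("0.0.0.0", SUPERPEER_PORT))
        srv.listen()
        conn, addr = srv.accept()   # inbound connection works: no NAT blocking it
        data = conn.recv(4096)      # traffic arriving from one peer ...
        conn.sendall(data)          # ... sent back out (echo stands in for relaying)
        conn.close()
        srv.close()

    def run_peer(message: bytes) -> bytes:
        """A peer behind NAT cannot accept inbound connections; it can only
        open an outbound connection to a superpeer and speak first."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.connect((SUPERPEER_HOST, SUPERPEER_PORT))  # the peer must initiate
        sock.sendall(message)       # anything "peer-to-peer" goes through the superpeer
        reply = sock.recv(4096)
        sock.close()
        return reply

Nothing in run_peer can be turned around: there is no way for another NATed peer to call it, which is why every "peer-to-peer" exchange between two ordinary peers ends up relayed through at least one superpeer.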
notthisbody     Tue, Aug 31, 2010  Permanent link
from Pierre Levy's IEML (information economy meta-language) Vision Document:

in order to take best advantage of the unprecedented possibilities made available by cyberspace for the manipulation of symbols, we needed an intellectual technology that hypertextually links all possible concepts within a calculable network - yet without granting any particular privilege to any of them. In other words, we needed to extend the form “P2P” (which, although not common knowledge at the time, was nonetheless implicit in the structure of the Internet and hypertexts) to include the relationships between concepts. In order to retain this neutrality and equality of design, the generative motor for the new digitally-based thought instrument could be nothing other than the logical analysis of meaning itself. That way, no concept could be excluded or marginalized.