Originally published as Ashby, W.R. (1968). "Variety, constraint, and the law of requisite variety" in W. Buckley (ed.), Modern Systems Research for the Behavioral Scientist, Chicago, IL: Aldine Publishing Co.
Requisite Variety and the Difference that Makes a Difference: An Introduction to W. Ross Ashby's "Variety, Constraint, and the Law of Requisite Variety"
Cybernetics, Regulation, and Complex Systems
The first issue of Emergence: Complexity and Organization (Nos. 1 and 2, 2004) reprinted, as our first classic paper, W. Ross Ashby's "Principles of Self-Organizing Systems" with my introduction. Ashby's paper was selected for several reasons. First, Ashby had long been recognized as one of the most rigorous and original thinkers among the early cyberneticians, as demonstrated by his tackling of challenging areas, his clear argumentation, and his wide knowledge of science, mathematics, and philosophy. Second, Ashby's work did not shrink from going after popular shibboleths creeping like a vine around certain key concepts at the birth of complexity science. Third, Ashby set a high bar for later systems thinkers, one that, it seems, not many after him have been capable of reaching.
These reasons also stand for this issue's classic paper, but here the hot topic of self-organization is replaced by another of Ashby's major proposals, his Law of Requisite Variety. A close reading of the many references to Ashby's Law since he first proposed it, however, shows that most either bear only the most tenuous connection to what Ashby actually wrote or distort his notion altogether. Going back to the original source can therefore help set the record straight.
Like that of most early cyberneticians, Ashby's work was shaped by information theory, particularly as formulated in the work of Claude Shannon (1948). Ashby's term "variety" was closely linked to Shannon or information entropy, a metric that is at a maximum when the uncertainty or unpredictability of a message is greatest. An example would be when all the units of a message code are equally probable, analogous to a fair coin toss where the outcome cannot be predicted better than 1 out of 2 times. In the case of unequal probabilities, however, such as tossing an unfair or "loaded" coin, there is less information because of greater predictability (the standard formula is sketched below). As the noted Hungarian mathematician Alfred Rényi...
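For concreteness, the coin example can be sketched with Shannon's entropy formula (given here as standard background rather than quoted from Ashby's text), where $p_1, \ldots, p_n$ denote the probabilities of the $n$ possible outcomes of a source:

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i .$$

A fair coin, with $p_1 = p_2 = \tfrac{1}{2}$, gives $H = 1$ bit, the maximum for two outcomes, whereas a loaded coin that comes up heads with probability 0.9 gives $H = -(0.9 \log_2 0.9 + 0.1 \log_2 0.1) \approx 0.47$ bits, reflecting its greater predictability.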