Energy Characteristics and Kinetics of Information Interactions

Yu. A. Ershov

Sechenov Medical Academy, Moscow, Russia

Received September 15, 1998

Abstract

A definition of associated information as a measure of the structural complexity of an object is given: the structure of an object is represented by subsets of its parts and relations, and any measure of complexity is a function mapping the structure of the object. The general properties of this function are formulated as a system of axioms. A formula satisfying these axioms and designed for calculating associated structural information, or the complexity of the object, is introduced. The obtaining of information is treated as an analytic procedure. For qualitative analysis purposes, the types of object substructures and relations are established. Quantitative analysis is performed by determining the fractions of substructures and relations of various types. On this basis, general relations for calculating the energy and material expenditures for transmission, reception, and storage of information in complex systems are obtained. Examples of applying these general relations to various objects are given. Kinetic equations describing the dynamics of information interactions are derived. To determine the substance and energy fluxes that carry information, the partial material or partial energy cost of information should be calculated; formulas for calculating these values are given. It is shown that transmitted and associated information are not functions of the energy characteristics of an information carrier; that is, information and thermodynamic entropy are only formally similar to each other.

INTRODUCTION

In communication theory, a set of transmitted symbols is called a message. According to Shannon [1, 2], the amount of information H in a message is calculated by the generalized Hartley formula

H = -k_S (p_1 log_2 p_1 + p_2 log_2 p_2 + ... + p_n log_2 p_n),  (1)

where p_1, p_2, ..., p_n are the probabilities (frequencies) of transmission through a communication channel of one of n possible messages, and k_S is the Shannon constant, which depends on the units selected to measure information. Thermodynamic entropy S is calculated by the Boltzmann formula

S = -k_B (w_1 ln w_1 + w_2 ln w_2 + ... + w_n ln w_n),  (2)

where w_1, w_2, ..., w_n are the probabilities of thermodynamic system states, and k_B is the Boltzmann constant equal to 1.38 x 10^-23 J/K. Equations (1) and (2) are obviously analogous. However, a correct criterion for relating the amount of information and its accompanying energy expenditures to thermodynamic entropy has not yet been suggested. To find such a criterion, a constructive definition of information should be given. Here, as in [3], we axiomatically introduce concepts that can be used to construct a heuristic conception. Conclusions from this conception do not contradict experimental data or the known laws of nature.
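The formal analogy between (1) and (2) can be illustrated numerically; the following Python sketch (the function names are ours, not the paper's) evaluates both formulas for the same probability distribution.

```python
import math

def shannon_information(probs, k_s=1.0):
    # Eq. (1): H = -k_S * sum(p_i * log2 p_i); in bits for k_S = 1.
    return -k_s * sum(p * math.log2(p) for p in probs if p > 0)

def boltzmann_entropy(probs, k_b=1.380649e-23):
    # Eq. (2): S = -k_B * sum(w_i * ln w_i), in J/K.
    return -k_b * sum(w * math.log(w) for w in probs if w > 0)

# Four equiprobable states: 2 bits of information, ~1.9e-23 J/K of entropy.
states = [0.25, 0.25, 0.25, 0.25]
print(shannon_information(states))  # 2.0
print(boltzmann_entropy(states))    # ~1.91e-23
```

The two functions differ only in the constant and the logarithm base, which is exactly the formal similarity the paper goes on to qualify.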


OBJECTS, SYSTEMS, STRUCTURES, AND RELATIONS

Both material and abstract objects can be represented as sets U of their parts (fragments, elements) and sets R of relations between the parts. Relations ensure the wholeness and sameness of an object. The pair of sets U and R is called the structural system, or simply the structure, of the object and is denoted by SS [4]. Structure elements can be material or ideal objects of various natures. In chemistry and physics, chemical elements, substances, unit cells, and phases are treated as object constituents. Accordingly, the set U of these parts and the set R of relations between them make up a chemical structure. In statistical physics, an object is treated as an ensemble of a large number of interacting molecular subsystems. In linguistics, words and utterances form text structures.

The basic difficulty of the theory of systems is the absence of a strict criterion for partitioning objects. In [5], the principle of energy differentiation based on the known scheme of the hierarchical organization of nature was suggested as such a criterion. In chemical and physical studies, this approach is followed in partitioning objects into phases, substance molecules constituting these phases, and atoms constituting the molecules. In biology, a biocenosis is divided into populations, populations are comprised of organisms, organisms represent sets of tissues and cells, and cells are formed from molecules. It follows from the aforesaid that the sets of parts U and relations R of structure SS are subdivided into subsystems of some lower level, the dynamics of which is determined by elementary events.

[Table. Hierarchy of subsystems]

Next, higher-level subsystems (strata, layers, and classes) are formed. The elements of each higher-level subsystem are the systems of the preceding level, and the elementary events are the global (general) processes of the preceding level. The process is continued until the system SS of the whole object is reached. The result is a distribution of systems over sublevels written as a system of embedded subsets,

U_j ⊂ U_{j+1} ⊂ U_{j+2} ⊂ ... ⊂ U ⊂ SS,  (3)

where U_j is the subset of the smallest elements of the SS system, and U is the set of its largest elements. To emphasize the complexity of so-called elementary subsystems, the concept of the unitary structure is introduced. The concept of the unitary structure is equivalent to that of an element of a given level [5].
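The embedding in (3) can be made concrete with a small sketch; the particular levels and the composition test below are our illustration, not the author's.

```python
# Levels of a structural system SS as in eq. (3): each unitary structure of
# level j+1 is a combination (here, a tuple) of level-j unitary structures.
levels = {
    "j":   ["H", "H", "O"],                        # atoms
    "j+1": [("H", "H", "O")],                      # a molecule built from atoms
    "j+2": [(("H", "H", "O"), ("H", "H", "O"))],   # an associate of two molecules
}

def composed_of(upper, lower):
    # Every part of every upper-level structure must be a lower-level element.
    return all(part in lower for unit in upper for part in unit)

print(composed_of(levels["j+1"], levels["j"]))    # True
print(composed_of(levels["j+2"], levels["j+1"]))  # True
```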



THE PRINCIPLE OF DIFFERENTIATION BASED ON RELATIONS AND QUASI-SEPARABILITY OF THE STRUCTURE OF AN OBJECT

A formalized description of complex objects is considered in [6, 7]. Complex objects are consistently described by the set theory of Russell and Whitehead [8]. Each object is put in correspondence with its type. A set can only comprise objects of the same type, for instance, type j (table). The set itself is then an object of type j + 1 and can at the same time be an element of a set of type j + 2, etc. In this theory, sets of unitary structures make up a hierarchy of types. In conformity with the aforesaid, the structure of an object is defined as the set R of relations defined on the set U of the unitary structures of this object. The object as a unique entity is determined by the type of the elements of set U and the properties of the set of relations R, in particular, by the energy characteristics of the relations between elements. For this reason, a hierarchy of element types can naturally be constructed with the use of differentiation according to the energy of relations as an empirical criterion.

Let the initial object consist of unitary structures between which both strong and weak interactions can arise (for instance, let it be a mixture of different atoms). In some time, strong interactions are exhausted with the formation of superstructures j + 1 (table) as combinations of the initial unitary structures j (in chemistry, various molecules are formed from atoms). Next, weaker interactions can cause aggregation of superstructures j + 1 into superstructures j + 2 of a higher level (molecular associates). As a result, an object with a hierarchical structure is formed.

Actions on a material object cause decomposition. Usually, weak bonds dissociate first to produce aggregates. Next, stronger bonds undergo splitting to cause decomposition into the initial unitary structures of a minimum size (atoms). By analogy with this real procedure, objects with unitary structures of different types can be mentally decomposed (quasi-decomposed). The principle of differentiation according to bond energies is formulated as follows: the strength of bonds between unitary structures of a given level (strong bonds) far exceeds the strength of bonds between the corresponding superstructures of the same type (weak bonds). Formally, this differentiation principle can be written as

E_j >> E_{j+1},  (4)

where E_j is the bond energy between the elements of level (substructure) j, and E_{j+1} is the bond energy between the elements of the higher (j + 1) level (superstructure, see table). For relation (4) to be used, it is necessary to find a measure of the strength of bonds between elements. A bond is an interaction or relation between objects by means of energy, substance, or information exchange. Accordingly, we distinguish between energy, material, and information relations. The strength of a static bond is measured by the energy of its dissociation. The strength of an exchange bond between objects is measured by the substance and energy fluxes from one object to the other. A particular case of exchange interactions is information interactions. Note that substance, energy, and information fluxes are conjugated with each other. A sketch of such a quasi-decomposition is given below.
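The following Python sketch (our illustration; the bond energies and the union-find helper are assumptions, not taken from the paper) quasi-decomposes a set of unitary structures by keeping only bonds above an energy threshold, in the spirit of the differentiation principle (4).

```python
def decompose(n_units, bonds, threshold):
    # Keep only bonds with energy >= threshold (strong bonds) and return the
    # connected components: the superstructures of the next level.
    parent = list(range(n_units))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i, j, energy in bonds:
        if energy >= threshold:
            parent[find(i)] = find(j)

    groups = {}
    for u in range(n_units):
        groups.setdefault(find(u), []).append(u)
    return list(groups.values())

# Four atoms: strong bonds 0-1 and 2-3 (4.5 eV), one weak bond 1-2 (0.2 eV).
bonds = [(0, 1, 4.5), (2, 3, 4.5), (1, 2, 0.2)]
print(decompose(4, bonds, 1.0))  # [[0, 1], [2, 3]] -> two "molecules"
print(decompose(4, bonds, 0.1))  # [[0, 1, 2, 3]]   -> one "associate"
```

Lowering the threshold admits the weak bonds and merges the level-(j+1) superstructures into a level-(j+2) one, mirroring the hierarchy described above.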
COMPLEXITY OF AN OBJECT AND ASSOCIATED INFORMATION

Along with the Shannon definition of information, many others are employed in practice [9]. Equation (1) can conveniently be used to estimate transmitted information but is inapplicable to calculations of stored, or associated, information. The state of an object is determined by a set of structural data comprised of a subset of object parts and relations and a subset of the properties of these parts and relations. The motion of an object is represented by changes in its state in the form of the time dependence of this set of data. Formally, the state of an object can be described by a set of (q, t) functions, where q are state variables and t is time. Coding the (q, t) functions in a given language and their transmission is the essence of information interaction. Information about an object is determined by a set of terms (signs, symbols, or signals) mapping the state of the object in a given language and stored on one or another carrier. The structure of an object is mapped by the subset of its parts and relations. Associated, or structural, information about an object is determined by a set of terms mapping the subset of its parts and relations with the required accuracy [3]. According to this definition, associated information is a measure of the structural complexity of the object.

It is intuitively clear that the complexity of systems increases with the number of unitary structures (the cardinality of set U) and the number of classes (types) of elements equivalent in some respect. For instance, chemistry and physics treat any material object as consisting of particles of only three types, namely, protons, neutrons, and electrons. These particles, bound with each other in various combinations by nuclear and electric forces, form the atoms of more than 100 chemical elements. The same particles, combined to form carbon, hydrogen, and oxygen linked by chemical bonds, form organic substance molecules. Various methods for measuring complexity have been suggested [10]. Any measure of complexity is a function, or a map, of the structure of the object. The general properties of this function can be formulated as seven axioms.
Axiom 0 (A0). A set of unitary structures U of an arbitrary object can be divided, as described above, into subsets U_j of equivalent unitary structures called adjacent classes, strata, or parts.

Axiom 1 (A1). The complexity of the empty set equals zero.

Axiom 2 (A2). The complexity of a class U_j of unitary structures cannot be greater than the complexity of the set U of all unitary structures of the object incorporating this class.

Axiom 3 (A3). If there is a single-valued correspondence (homomorphism) between the unitary structures of set U_i and the unitary structures of set U_j, the complexity of U_i does not exceed that of U_j.

Axiom 4 (A4). If there is a one-to-one correspondence between the unitary structures of sets U_i and U_j, the complexities of these sets are equal.

Axiom 5 (A5). The complexity of an object comprising several unrelated parts that do not have common unitary structures equals the sum of the complexities of these parts.

Axiom 6 (A6). The complexity of the minimum element equals zero.

Axiom 0 includes quasi-separability of an object and the definition of the structure as a union of unitary structures. Axioms 1 and 2 imply nonnegativity of the measure of structural complexity. According to A2 and A3, the measure of complexity should be monotonic, and according to A4, complexity does not change as a result of the introduction of new denotations (indices) for some arbitrary unitary structures without varying the other denotations. Axiom A5 determines the complexity of an object comprising independent parts, and A6 introduces a natural unit of complexity for expressing complexity in numbers.

Axiom 3 is illustrated by the postulate of identical systems, according to which the similarity and dissimilarity of objects are determined by the similarity and dissimilarity of both the parts and the relations forming the structure of the objects. One of its corollaries is the widely used analog simulation. Note that there is no one-to-one structure-property correspondence; that is, the equality of the functions does not imply that the structures are identical. For instance, compare birds and airplanes or fish and whales.

By analogy with (1), associated structural information, or object complexity, C(S) can be written as

C(S) = -(f_1 log_2 f_1 + f_2 log_2 f_2 + ... + f_n log_2 f_n).  (5)

Here, f_k is the frequency of unitary structures of type k determined by the equality f_k = N_k/N_0, where N_0 is the total number of unitary structures in object S, and N_k is the number of unitary structures of type k (N_1, ..., N_n are the numbers of unitary structures of types 1, 2, ..., n). The number and types of relations in set R of the structure do not appear in (5). Indirectly, these characteristics are taken into account by the numbers and types of the unitary structures. It can easily be checked that the complexity calculated by (5) satisfies the requirements of axioms 1-6.

In spite of the formal similarity of expressions (2) for the entropy and (5) for information, they are basically different. In (2), the Boltzmann constant k_B determines the entropy as an observable value, that is, the total reduced heat obtained by the object in a quasi-equilibrium process of formation of the state under consideration. The dimensionless information calculated by (5) is determined by the number of bits necessary for describing the set of the properties of the object in a given state. It follows from (5) that all objects comprising N unitary structures of pairwise different types contain equal amounts of associated information, or have a complexity equal to log_2 N (since each f_k = 1/N), which is an analogue of the Boltzmann formula in statistical physics.
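Equation (5) transcribes directly into Python; in this sketch of ours, an object is tallied as a multiset of type labels, which is an assumption about the bookkeeping rather than anything prescribed by the paper.

```python
import math
from collections import Counter

def complexity(units):
    # Eq. (5): C(S) = -sum(f_k * log2 f_k), with f_k = N_k / N_0.
    counts = Counter(units)
    n0 = sum(counts.values())
    return -sum((n / n0) * math.log2(n / n0) for n in counts.values())

print(complexity(["H", "H", "O"]))  # a water molecule by atom types: ~0.918 bits
print(complexity(range(8)))         # 8 structures of distinct types: 3.0 = log2 8
```

The second call shows the log_2 N limiting case mentioned above; nonnegativity and additivity over independent parts (axioms 1, 2, and 5) can be checked the same way.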
It is intuitively clear that the associated information of a biological community exceeds that of an organism entering this community, and that the associated information of an organism exceeds that of its chemical composition. Calculations by (5), however, give values close in order of magnitude. The contradiction is removed if it is taken into account that both the objects and the fragments into which the objects are separated have different complexities and belong to different types (table). For instance, biogenic substances consist of different molecules having different atomic compositions. Organs of an animal are made
of different tissues comprising different molecules differently linked with each other. Biological communities comprise populations of different species, populations comprise different specimens, and so on down to molecules. The complexity of an object can be taken into account if the object is represented in the form of a hierarchy of smaller unitary structures (table). For this purpose, equation (5) is augmented by additional terms to obtain the generalized expression for the associated information of object S,

C(S) = -(Σ_k f_k log_2 f_k + Σ_kl f_kl log_2 f_kl + Σ_klm f_klm log_2 f_klm),  (6)

where f_k, f_kl, and f_klm are the frequencies of unitary structures of types k, kl, and klm equal to f_k = N_k/N_0, f_kl = N_kl/N_k, and f_klm = N_klm/N_kl, where N_0 is the total number of the larger unitary structures of object S, and N_k, N_kl, and N_klm are the numbers of smaller unitary structures of types k, kl, and klm into which each unitary structure of the preceding (higher) level is separated (table). Each sum in (6) determines the associated structural information of one layer of the partitioning of the set U of the elements of the object. Of interest is the increase in complexity accompanying structure detailing (axiom 2).

Associated information is a function of the state of equilibrium or quasi-equilibrium objects. Such structures are sustained without energy expenditures, and calculations of the strength of static bonds are conducted within the framework of thermodynamics. The energy characteristics of nonequilibrium objects and transmitted information are the subject matter of kinetics.
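A layered version of the complexity calculation follows directly from (6); the frequencies below are invented for illustration, as is the representation of the layers as plain lists.

```python
import math

def layer_information(freqs):
    # One entropy-like sum of eq. (6) for a single partitioning layer.
    return -sum(f * math.log2(f) for f in freqs if f > 0)

def hierarchical_complexity(layers):
    # Eq. (6): associated information summed over the layers k, kl, klm, ...
    return sum(layer_information(freqs) for freqs in layers)

# Illustrative frequencies for three layers of partitioning (f_k, f_kl, f_klm).
layers = [
    [0.5, 0.5],        # two types of large unitary structures
    [0.25, 0.75],      # smaller structures inside them
    [0.1, 0.4, 0.5],   # the finest layer considered
]
# Detailing the structure adds nonnegative terms, so complexity grows (axiom 2).
print(hierarchical_complexity(layers))  # ~3.17 bits
```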
KINETICS OF INFORMATION PROCESSES

An information chain is a system providing transmission of information from an object to a receiver [11]; various communication systems [2] are information chains. An information flux is the value determined by the rate of transmission of information through an information chain; transmission of information between chain units is effected by information contacts. An information contact is a system of two elements, the source of information and the receiver (monitor) recording the incoming information. Clearly, the shortest information chain comprises two units, a source of information and a monitor connected by an information connection (transmitted information).

The reception of information can be treated as an analytic procedure. Qualitative analysis is performed by establishing the types of the unitary structures and relations constituting the object. Quantitative analysis implies the determination of the fractions of unitary structures and relations of various types. A chemist performs qualitative analysis by determining the chemical elements, molecules, and phases constituting a body. An ecologist studies the biological specimens, species, and populations constituting a community and the chemical composition of the environment. Quantitative analysis of the same objects is performed by measuring the numbers of the different unitary structures found by qualitative analysis techniques. Obtaining information requires the use of an observation channel, that is, a device and a procedure for recording terms (signs and signals reflecting the structure of an object and its state) [1, 2, 4]. Technically, observation channels are devices of different complexity. Sensory organs are the natural observation channels of animals. Supplying information to information chains of any complexity begins with a primary information contact; this is accompanied by an exchange of energy and substance between the source of information and the receiver (monitor), with exchange coupling arising.

As a result, the associated information of the source is displayed on the monitor. From the point of view of exchange coupling, taking fingerprints and locating the surface relief of a planet by a radio beam from an aircraft are procedures of the same type. In the first case, the structure of the skin of a finger is mapped on paper; in the second, the structure of the surface is mapped on the screen of a visual monitor and in the memory of the onboard computer. As a result of an information
contact, the information about the structure of the source of information is mapped as associated information of the receiver (monitor). A detailed study of various information contacts leads to the concept of transmitted information as the mapping of the associated information of one object (the source) onto the structure of another object (the monitor).

Transmitted information is something different from associated information. The internal energy of a body and the energy transferred from that body to other bodies are similarly related. But this is only a limited analogy. Energy transfer obeys the law of conservation of energy, and the internal energy of a body decreases by an amount strictly equal to the sum of the work produced and the heat transferred. The associated information of a source also decreases as a result of interaction, but by an amount depending on the degree to which a contact with a monitor affects the structure of the source of information. Noninvasive methods and introscopic observation channels have virtually no effect on the structure of the object under study. At the same time, there exist many analytic methods that completely destroy the sample and cause the loss of associated information; the energy expended in such a contact is the energy of information interaction. It is important that, during any information contact, not only does the source act on the monitor, but the monitor also acts on the source. This is especially manifest in studies of microscopic objects with the use of macroscopic devices; it is also embodied in the uncertainty principle. As a consequence, transmitted information cannot exceed the associated information of an object; that is,

I(S) ≤ C(S),  (8)

where C(S) is the associated information of object S, and I(S) is the transmitted information about the object.

The amount of transmitted information depends on the structures of the object and the monitor and on the energy and the substance carrying the information. During an information contact, there arises a composition of the structures of the information source and the monitor, which is the essence of the transmitted information. This composition manifests itself most clearly when an object is mapped onto a screen. The shape and size of the object (the object geometry) are then superposed on a net of elements. Various structures are similarly mapped onto the net of a cellular automaton [6]. Formally, transmission of information by means of an information contact can be described with the use of a mathematical frame construction [12]. The amount of information transmitted from object S to a monitor is calculated by the formula

I(S) = -(Σ_k f^t_k log_2 f^t_k + Σ_kl f^t_kl log_2 f^t_kl + Σ_klm f^t_klm log_2 f^t_klm).  (9)

Here, f^t_k, f^t_kl, and f^t_klm are the frequencies of unitary structures k, kl, and klm, respectively, based on the transmitted information: f^t_k = N^t_k/N^t_0, f^t_kl = N^t_kl/N^t_k, and f^t_klm = N^t_klm/N^t_kl, where N^t_0 is the total number of the larger unitary structures of object S, and N^t_k, N^t_kl, and N^t_klm are the numbers of smaller unitary structures of types k, kl, and klm, respectively, determined with the use of an observation channel (see above). Equation (9) is similar to (6) in form, but the frequencies and the numbers of the various unitary structures are labelled by superscript t (transinformation). The decrease in transmitted information in comparison with associated information [inequality (8)] depends on the extent to which the structures and characteristics of the source and the observation channel match each other; these characteristics include detection limits, selectivity, resolution, and reliability [13].
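Inequality (8) can be demonstrated with a minimal single-layer sketch; the scenario below, an observation channel that cannot distinguish type C from type B so that the observed counts carry the superscript t, is our invention.

```python
import math
from collections import Counter

def info(units):
    # The entropy-like measure shared by eqs. (5) and (9).
    counts = Counter(units)
    n0 = sum(counts.values())
    return -sum((n / n0) * math.log2(n / n0) for n in counts.values())

source   = ["A", "A", "B", "B", "C", "C"]  # true unitary structures of the object
observed = ["A", "A", "B", "B", "B", "B"]  # channel merges C into B (limited selectivity)

C_S = info(source)    # associated information, eq. (5): ~1.585 bits
I_S = info(observed)  # transmitted information, eq. (9): ~0.918 bits
print(I_S <= C_S)     # inequality (8) holds: True
```

Coarsening the type partition can only lower the entropy-like sum, which is why a mismatched observation channel transmits less than the associated information.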
The necessary condition for obtaining information about a varying object is matching the rate of object variations to the rate of information transmission. From the information point of view [7], any change in an object is a change in the set of unitary structures at some hierarchy level (table). A change in the set of unitary structures of an object is called the general process of the object.
A reaction replacing atoms in reagent molecules is the general process in a chemical reactor. Growth of an organism is the general process of the appearance of new and the removal of old cells. Population changes caused by the birth and death of living organisms are the general process for populations.

Axiom 7 (A7). The essence of any object and its sameness do not change while and so far as the numbers and the frequencies of unitary structures at the higher levels remain constant (N_{j+1} and N_{j+2}, see table) and only the numbers of less complex unitary structures (N_j, table) change.

Comparatively simple equations describing the dynamics of complex objects are obtained with the use of aggregated characteristics, that is, values that describe the object as a whole [14, 15]. In the theory of systems and in thermodynamics, these are functions of state. Aggregated variables appear in kinetic equations describing the evolution of these variables and, therefore, of the object as a whole. In these equations, the numbers of unitary structures N_k, N_kl, and N_klm are functions of time. Accordingly, associated information C(S) and transmitted information I(S) represent aggregated variables describing the behavior of the object in time. The kinetic equations describing the dynamics of information are obtained by differentiation of (6) and (9). The kinetics of associated information C(t) is described by the equation

dC/dt = -0.71(Σ_k (1 + ln f_k) df_k/dt + Σ_kl (1 + ln f_kl) df_kl/dt + Σ_klm (1 + ln f_klm) df_klm/dt),  (10)

where dC/dt is the rate at which the structure of the object changes; f_k, f_kl, and f_klm are the frequencies of unitary structures of types k, kl, and klm, respectively [see equation (6)]; and df_k/dt, df_kl/dt, and df_klm/dt are the rates of change of these frequencies. The kinetics of transmitted information I(t) is described by the equation

dI/dt = -0.71(Σ_k (1 + ln f^t_k) df^t_k/dt + Σ_kl (1 + ln f^t_kl) df^t_kl/dt + Σ_klm (1 + ln f^t_klm) df^t_klm/dt).  (11)

Equation (11) for the transmitted information has the same form as (10), but all frequencies appear in (11) with superscript t [formula (9)].

Axiom 7 facilitates the monitoring of complex objects. The frequencies of the higher unitary structures determining object sameness do not change, and the rates of their variation equal zero. To obtain information about the state of the object, it therefore suffices to monitor the general process. For instance, if N_k and N_kl do not change, equation (10) takes the form

dC/dt = -0.71 Σ_klm (1 + ln f_klm) df_klm/dt,  (12)

where f_klm are the frequencies of the unitary structures at the third, lower level (table), whose rates of change df_klm/dt describe the dynamics of the general process. For instance, when monitoring the state of an animal, control can be confined to the digestive, circulatory, and nervous systems. An animal remains viable while these three systems and their parts and tissues remain intact. The tissues themselves can, within certain bounds, change in the course of time; that is, the general process occurs at the fourth level counting from above. Aggregated characteristics, namely, the weight fractions of the corresponding tissues, can conveniently be used as the frequencies f_klm. These values can be determined, for instance, by X-ray examination. The rates of weight fraction variation df_klm/dt for various tissues are calculated from the results of periodic observations. The substitution into (12) of values obtained in this way gives a kinetic equation describing the vital activity of the organism.
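Under axiom 7, monitoring reduces to evaluating eq. (12) over the lowest varying layer. In the sketch below, the tissue weight fractions and their rates of change are invented for illustration; the paper's coefficient 0.71 is kept as given.

```python
import math

def dC_dt(freqs, rates, coeff=0.71):
    # Eq. (12): dC/dt = -0.71 * sum((1 + ln f_klm) * df_klm/dt)
    # over the monitored layer of unitary structures.
    return -coeff * sum((1 + math.log(f)) * df for f, df in zip(freqs, rates))

# Weight fractions of three tissue types (frequencies f_klm) and their rates
# of change estimated from two periodic observations (per day, illustrative).
f  = [0.2, 0.3, 0.5]
df = [0.010, -0.005, -0.005]
print(dC_dt(f, df))  # the sign and magnitude characterize the general process
```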
ENERGY AND MATERIAL COST OF INFORMATION

According to (11), the rate of changes in the structure of an object determines the flux of information from the source to the monitor. To find the fluxes of substance and energy carrying information, we must calculate the partial material (K_m) or the partial energy (K_E) cost of information given by the formulas

K_m = Δm/I,  (13)

K_E = ΔE/I,  (14)
where Δm and ΔE are the mass and the energy consumed in transmitting information I. Substance and energy expenditures for transmission of information are calculated by the formulas

Δm = K_m I,  (13a)

ΔE = K_E I.  (14a)

The partial information costs K_m and K_E are functions of the design of the information contact and of the procedure for transmitting information. For this reason, the partial cost of information is a nonthermodynamic value, and, therefore, the relation between information and thermodynamic entropy is, if it exists, indirect. Formally, this relation can be represented, for instance, as follows. It can be assumed, in conformity with the thermodynamic definition, that the increase in the entropy of the receiver equals ΔS = ΔE/T, where T is the temperature of the receiver. Equation (14a) then relates entropy to information as

ΔS = (K_E/T) I,  (15)

where K_E/T is the (unknown) coefficient for estimating thermodynamic entropy changes from the amount of transmitted information. Note that the cost of information determined by (13) and (14) is independent of the units used to measure the amount of information. It is only important that the measure of information satisfy axioms 1-6 and be a single-valued function.

If the structure is known, its description is equivalent to some text from the point of view of information theory. Text words are the symbols denoting the elements of the structure. In the considered example of a hierarchical structure (table), these are U_k, U_kl, and U_klm, corresponding to levels j + 2, j + 1, and j, respectively. The minimum requirements for coding the elements are three letters (digits) for level j + 2, five letters for level j + 1, and seven letters for level j. The information needed for a complete description of the three levels, given in bytes, can then be calculated as

I = 3N_k + 5N_kl + 7N_klm.  (16)

The energy cost of this information is ΔE = K_E I [equation (14a)], where K_E is the cost of transmitting one byte. The lower bound of the energy conjugated with the transmitted information is determined by the amplitude of thermal fluctuations E_T of the source and the monitor. This amplitude is of the order of E_T = k_B T, where T is the temperature. The upper bound of the energy of the information flux is determined by the minimum energy E_D of disruption of the weakest connections between the source of information and the monitor. This means that the thermodynamic entropy of transmission of information can vary in the range from k_B to E_D/T. At the same time, the transmitted information can remain virtually unchanged [equation (9)]. This emphasizes that the similarity between information and thermodynamic entropy is only formal.
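A small numeric sketch ties (13)-(16) together; all input values, and the helper names, are illustrative assumptions rather than data from the paper.

```python
def description_bytes(n_k, n_kl, n_klm):
    # Eq. (16): 3, 5, and 7 letters per element of levels j+2, j+1, and j.
    return 3 * n_k + 5 * n_kl + 7 * n_klm

def partial_costs(delta_m, delta_e, info):
    # Eqs. (13)-(14): partial material and energy costs of information.
    return delta_m / info, delta_e / info

I = description_bytes(n_k=4, n_kl=20, n_klm=100)  # 812 bytes
K_m, K_E = partial_costs(delta_m=1e-9, delta_e=2e-6, info=I)
print(I, K_m, K_E)   # bytes, kg/byte, J/byte for this particular contact
# Expenditures for another transmission of the same kind, eqs. (13a)-(14a):
print(K_m * I, K_E * I)
```

Because K_m and K_E are set by the design of the contact, changing the channel changes the costs while leaving I itself untouched, which is the nonthermodynamic character argued above.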

In complex objects, separate information contacts make up information chains with feedback loops, cycles, and nets; these chains can be represented in the form of graphs. Given below is the graph of the control system of the higher organisms [3, 16]:

[Figure. Graph of the control system of the higher organisms]

Arrows denote information contacts I_kl between components k and l; CNC is the central nervous system control, BC is the behavioral control, AC is the autonomic nervous system control, HC is the humoral control, MC is the metabolic control, R is the result, and RR is the reception of the result. The information for each contact kl is calculated by (12). Next, (13a) and (14a) are used to determine the material (Δm) and energy (ΔE) expenditures. The control system has a hierarchical structure. The elementary control systems of separate cells are connected and obey the control systems of tissues and organs. The latter are controlled by the nervous and humoral systems of the organism as a whole. Each system with feedback


forms a closed circuit. The material carriers of the controlling signals of living control systems are the products of metabolism of these systems; that is, information fluxes are conjugated with metabolic fluxes. The energy characteristics and the kinetics of metabolic fluxes have been studied in detail, which makes it possible to calculate the material and energy cost of information interactions in various biological systems, from cells to ecological systems [17]. Equations (13) and (14) can therefore be used to calculate the expenditures for information support of the vital activity of organisms. For instance, calculations show that ~30% of the energy supplied by food is consumed to support the nervous activity of humans. The behavior of natural control systems is characterized by properties common to all technical information systems, and the procedure described above can be applied to objects with a high organization, so-called organismic technical systems [15, 18, 19].

REFERENCES

1. Shannon, C.E., Raboty po teorii informatsii i kibernetike (Works on the Theory of Information and Cybernetics), Dobrushin, R.L. and Lupanov, O.B., Eds., Moscow: Inostrannaya Literatura, 1963.
2. Shannon, C.E., Bell Syst. Tech. J., 1948, vol. 27, no. 3, p. 379; no. 4, p. 623.
3. Ershov, Yu.A., Problemy infovzaimodeistviya (Problems of Information Interaction), Novosibirsk: Nauka, 1993, pp. 178-205.
4. Klir, G.J., Architecture of Systems Problem Solving, New York: Plenum, 1985.
5. Ershov, Yu.A., Termodinamika kvaziravnovesii v biologicheskikh sistemakh (Thermodynamics of Quasi-Equilibria in Biological Systems), Moscow: VINITI, 1983.
6. Levich, A.P., Teoriya mnozhestv, yazyk teorii kategorii i ikh primenenie v teoreticheskoi biologii (Set Theory, the Language of Category Theory, and Their Application to Theoretical Biology), Moscow: Mosk. Gos. Univ., 1982.
7. Levich, A.P., Sistemnye issledovaniya (System Analysis), Moscow: Nauka, 1989, pp. 304-325.
8. Whitehead, A.N., Essays in Science and Philosophy, New York: Greenwood, 1968. Translated under the title Izbrannye raboty po filosofii, Moscow, 1990.
9. Gubanov, V.A., Zakharov, V.V., and Kovalenko, N.N., Vvedenie v sistemnyi analiz (Introduction to System Analysis), Leningrad: Leningr. Gos. Univ., 1988.
10. Rosen, R., Fundamentals of Measurement and Representation of Natural Systems, New York: North-Holland, 1978.
11. Kochergin, A.N. and Kogan, V.Z., Problemy informatsionnogo vzaimodeistviya v obshchestve (Problems of Information Interaction in Society), Moscow: Nauka, 1980.
12. Shreider, Yu.A. and Sharov, A.A., Sistemy i modeli (Systems and Models), Moscow: Nauka, 1982.
13. Harmuth, H.F., Information Theory Applied to Space-Time Physics, Singapore: World Scientific, 1992. Translated under the title Primenenie metodov teorii informatsii v fizike, Moscow: Nauka, 1989.
14. Moiseev, N.N., Aleksandrov, V.V., and Tarko, A.M., Chelovek i biosfera (Man and the Biosphere), Moscow: Nauka, 1985.
15. Moiseev, N.N., Matematicheskie zadachi sistemnogo analiza (Mathematical Problems of System Analysis), Moscow: Nauka, 1981.
16. Funktsional'nye sistemy organizma (Functional Systems of the Organism), Sudakov, K.V., Ed., Moscow: Meditsina, 1987.



17. Ershov, Yu.A. and Mashkamborov, N.N., Kinetika i termodinamika biokhimicheskikh i fiziologicheskikh protsessov (Kinetics and Thermodynamics of Biochemical and Physiological Processes), Moscow, 1990.
18. Rashevsky, N., Organismic Sets: Some Reflections on the Nature of Life and Society, Holland, Michigan: Mathematical Biology, 1972.
19. Novosel'tsev, V.N., Organizm v mire tekhniki: kiberneticheskii aspekt (The Organism in the World of Technology: the Cybernetic Aspect), Moscow: Nauka, 1989.
