Some considerations on Carnot's principle and its connection with MaxEnt, as a general principle of reasoning, have been advanced by Jaynes []. He described the evolution of Carnot's principle, via Kelvin's perception that it defines a universal temperature scale, Clausius' discovery that it implied the existence of the entropy function, Gibbs' perception of its logical status, and Boltzmann's interpretation of entropy in terms of phase volume, into the general formalism of statistical mechanics.

Fluctuations and Maxwell-like relations

As already shown, the average value of any dynamical quantity $P_j(\Gamma)$ of the basic set in MaxEnt-NESEF (the classical mechanical level of description is used for simplicity) is given by the expression sketched below.
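A sketch, assuming the standard MaxEnt form (the symbols $\bar{\varrho}$, $\phi(t)$, and $F_j(t)$ for the distribution, the normalization function, and the Lagrange multipliers follow the usual conventions and are introduced here for illustration):

$$
Q_j(t) \equiv \langle P_j \rangle^t = \int d\Gamma \, P_j(\Gamma)\, \bar{\varrho}(\Gamma|t),
\qquad
\bar{\varrho}(\Gamma|t) = \exp\Big\{-\phi(t) - \sum_k F_k(t)\, P_k(\Gamma)\Big\},
$$

with $\phi(t)$ ensuring the normalization of $\bar{\varrho}$.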
Moreover, from a straightforward calculation it follows that the correlations among the basic variables are given by the matrix sketched below.
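A sketch of that calculation, under the same conventions: differentiating $Q_j(t)$ with respect to the Lagrange multipliers generates the matrix of second moments,

$$
C_{jk}(t) \;=\; -\,\frac{\delta Q_j(t)}{\delta F_k(t)}
\;=\; \langle P_j P_k \rangle^t - \langle P_j \rangle^t \langle P_k \rangle^t ,
$$

and the symmetry $C_{jk}(t) = C_{kj}(t)$ yields the Maxwell-like relations $\delta Q_j/\delta F_k = \delta Q_k/\delta F_j$.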
The diagonal elements of $C_{jk}(t)$ are the mean-square deviations, or fluctuations, of the quantities $P_j(\Gamma)$. Let us next scale the informational entropy and the Lagrange multipliers in terms of Boltzmann's constant $k_B$; the scaled quantities, together with the resulting fluctuation of the IST informational entropy, are sketched below.
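A sketch, with the scaled quantities (the notation $\bar S$, $f_j$, and the operator $\hat S$ are assumptions following common IST usage):

$$
\bar S(t) = k_B\, S(t), \qquad f_j(t) = k_B\, F_j(t),
$$

and, writing $\hat S(\Gamma|t) = -\ln \bar{\varrho}(\Gamma|t) = \phi(t) + \sum_j F_j(t)\, P_j(\Gamma)$ for the informational-entropy generating function, its fluctuation is

$$
\Delta^2 \bar S(t) = k_B^2 \Big[ \langle \hat S^2 \rangle^t - \big(\langle \hat S \rangle^t\big)^2 \Big]
= \sum_{j,k} f_j(t)\, f_k(t)\, C_{jk}(t).
$$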
The latter equation has the likeness of an uncertainty principle connecting the variables $Q_j(t)$ and $f_j(t)$, which are thermodynamically conjugated in the sense of the preceding relations. This opens the possibility of relating the results of IST with the idea of complementarity between the microscopic and macroscopic descriptions of many-body systems advanced by Rosenfeld and Prigogine [50, ]; this is discussed elsewhere []. Care must be exercised in referring to fluctuations of the intensive variables $F_j$.
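Schematically, and only as an illustration (the precise form of the inequality depends on definitions not fixed here), such a relation reads

$$
\Delta Q_j(t)\, \Delta f_j(t) \;\gtrsim\; k_B ,
$$

with $\Delta Q_j(t) = [C_{jj}(t)]^{1/2}$ and $\Delta f_j(t)$ the second-order deviation of the conjugate intensive variable discussed next.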
In the statistical description fluctuations are associated with the specific variables $Q_j$, but the $F_j$ are Lagrange multipliers fixed by the average values of the $P_j$, and so $\Delta^2 F_j$ is not a proper fluctuation of $F_j$ but a second-order deviation, interpreted as resulting from the fluctuations of the variables on which it depends, in a generalization of the usual results of equilibrium statistical mechanics []. These brief considerations point to the desirability of developing a complete theory of fluctuations in the context of MaxEnt-NESEF; one relevant application would be the study of the kinetics of transitions between dissipative structures in complex systems, for which a phenomenological approach is presently available [99].
According to the results of the previous subsection, quite similarly to the case of equilibrium, it follows that the quotient between the root-mean-square deviation of a given quantity and its average value is of the order of the reciprocal of the square root of the number of particles. Consequently, again in analogy with the case of equilibrium, the number of states contributing for the quantity $P_j$ to have the given average value is overwhelmingly enormous, and the informational entropy can be written in a Boltzmann-like form (both relations are sketched below). We recall that this is an approximate result, with an error of the order of the reciprocal of the square root of the number of degrees of freedom of the system, and therefore exact only in the thermodynamic limit.
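A sketch of both statements, for a system of $N$ particles, writing $W(t)$ for the number of microscopic states (complexions) compatible with the macroscopic constraints at time $t$:

$$
\frac{\Delta P_j(t)}{Q_j(t)} \sim \frac{1}{\sqrt{N}},
\qquad
\bar S(t) \simeq k_B \ln W(t).
$$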
This expression is the IST equivalent of Boltzmann's expression for the thermodynamic entropy in terms of the logarithm of the number of complexions compatible with the macroscopic constraints imposed on the system. In terms of these results we can look again at the $\mathcal{H}$-theorem of subsection 4. With elapsing time, as pointed out by Bogoliubov, subsets of correlations die down (in the case of the photoinjected plasma this implies increasing processes of internal thermalization, nullification (decay) of fluxes, etc.).
In IST this corresponds to a diminishing informational space, meaning of course diminishing information and, therefore, a less constrained situation, with the consequent increase of the extension of $W(t)$ and increase in informational entropy. Citing Jaynes, it is this property of the entropy - measuring our degree of information about the microstate, which is conveyed by data on the macroscopic thermodynamic variables - that made information theory such a powerful tool in showing us how to generalize Gibbs' equilibrium ensembles to nonequilibrium ones.
The generalization could never have been found by those who thought that entropy was, like energy, a physical property of the microstate []. Also following Jaynes, $W(t)$ measures the degree of control of the experimenter over the microstate, when the only parameters the experimenter can manipulate are the usual macroscopic ones.
Because phase volume is conserved in the micro-dynamical evolution, it is a fundamental requirement on any reproducible process that the phase volume $W(t)$ compatible with the final state cannot be less than the phase volume $W(t_0)$ which describes our ability to reproduce the initial state []. On this we stress the point that deriving the behavior of the macroscopic state of the system from partial knowledge was already present in the original work of Gibbs.
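Combined with the Boltzmann-like expression above, this reproducibility requirement gives the increase of the informational entropy (a sketch, under the same notation):

$$
W(t) \ge W(t_0)
\quad\Longrightarrow\quad
\bar S(t) = k_B \ln W(t) \;\ge\; k_B \ln W(t_0) = \bar S(t_0).
$$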
This is at the roots of the well-established, fully accepted, and exceedingly successful statistical mechanics in equilibrium: the statistical distribution, which should depend on all constants of motion, is built, in any of the canonical ensembles, in terms of the available information we do have, namely, the preparation of the sample in the given experimental conditions, in equilibrium with a given and quite reduced set of reservoirs.
Werner Heisenberg wrote [], "Gibbs was the first to introduce a physical concept which can only be applied to an object when our knowledge of the object is incomplete". Returning to the question of the Bayesian approach in statistical mechanics, Sklar [4] has summarized that Jaynes first suggested that equilibrium statistical mechanics can be viewed as a special case of the general program of systematic inductive reasoning, and that, from this point of view, the probability distributions introduced into statistical mechanics have their basis not so much in an empirical investigation of occurrences in the world, but, instead, in a general procedure for determining appropriate a priori subjective probabilities in a systematic way.
Also, Jaynes' prescription was to choose the probability distribution which maximizes the statistical entropy (now conceived in the information-theoretic vein) relative to the known macroscopic constraints, using the standard measure over the phase space to characterize the space of possibilities.
According to Jaynes, the question of what are theoretically valid, and pragmatically useful, ways of applying probability theory in science was approached by Sir Harold Jeffreys [36,37], in the sense that he stated the general philosophy of what scientific inference is and proceeded to carry out both the mathematical theory and its implementations. But we recall that the method introduces from the outset a Lagrange multiplier which is a functional of the basic set of macrovariables. Therefore, the corresponding contribution to the full statistical operator, namely the one describing the dissipative evolution of the state of the system (to be clearly evidenced in the resulting kinetic theory), indicates that a fading-memory process has been introduced; it is the most important part, as the sketch below illustrates.
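As an illustration of this structure, a sketch in the Zubarev-style construction used in NESEF (the specific symbols $\varrho_\varepsilon$ and the limit convention are assumptions based on the standard formulation):

$$
\varrho_\varepsilon(t) = \exp\left\{ \ln \bar{\varrho}(t,0)
- \int_{-\infty}^{t} dt'\, e^{\varepsilon(t'-t)}\, \frac{d}{dt'} \ln \bar{\varrho}(t',t'-t) \right\},
$$

where the second argument of $\bar{\varrho}$ indicates the time dependence arising from the mechanical propagation, and the Abel kernel $e^{\varepsilon(t'-t)}$ (with $\varepsilon \to +0$ taken after the calculation of averages) weights the past history, thus implementing the fading memory.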
This probability assignment is a generalization of the probabilities determined by the Principle of Indifference in logic, specifying one's rational choice of a priori probabilities. In equilibrium this is connected with ergodic theory, as known from classical textbooks. Of course, it is implied that one accepts the justification of identifying ensemble averages with measured quantities, the latter involving averages over the time interval of duration of the experiment.
This cannot be extended to nonequilibrium conditions involving ultrafast relaxation processes. Therefore, there remains the explanatory question: why do our probabilistic assumptions work so well in giving us equilibrium values? The Bayesian approach attempts an answer which, apparently, works quite well in equilibrium, and it is then tempting to extend it to nonequilibrium conditions. Jaynes' rationale for it is, again, that the choice of probabilities, being determined by a Principle of Indifference, should represent maximum uncertainty relative to our knowledge, as exhausted by our knowledge of the macroscopic constraints with which we start [11].
This has been described in previous sections. At this point the question can be raised that, in the study of certain physico-chemical systems, we may face difficulties when handling situations involving fractal-like structures, correlations (spatial and temporal) with some type of scaling, turbulent or chaotic motion, finite-size (nanometer-scale) systems with eventually a low number of degrees of freedom, etc. These difficulties consist, as a rule, in that the researcher is unable to satisfy Fisher's Criterion of Sufficiency [] in the conventional, well-established, physically and logically sound Boltzmann-Gibbs statistics, meaning an inability to include the relevant and proper characterization of the system.