Extreme Physical Information (EPI): Response to Criticism
Roy Frieden has used Fisher information as a grounding principle for deriving and elaborating physical theory. (Examples are the Schrödinger wave equation of quantum mechanics, and the Maxwell-Boltzmann distribution of statistical mechanics.) Such theories take the form of differential equations or probability distribution functions. Central to Frieden's derivations is the mathematical variational principle of extreme physical information (EPI). This principle builds on the well-known idea that the observation of a "source" phenomenon is never completely accurate. That is, information is inevitably lost in transit from source to observation. Furthermore, the random errors that creep in are presumed to define the distribution function of the source phenomenon. As Frieden puts it, "the physics lies in the fluctuations." Finally, the information loss may be shown to be an extreme value. Thus if the observed level of Fisher information in the data has value I, and the level of Fisher information that existed at the source has value J, the EPI principle states that I − J = extremum. In most situations, the extremum is a minimum, meaning that there is a tendency for any observation to faithfully match up with its source.
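As a quick concrete anchor for the quantity I (an illustrative check, not part of the original text): for a one-dimensional probability law p(x) with real amplitude q(x) = √p(x), the shift-invariant Fisher information is I = 4∫dx q′(x)², which for a Gaussian of standard deviation σ equals 1/σ². A minimal Python sketch, with σ and the grid chosen arbitrarily:

import numpy as np

# Illustrative check of I = 4 * integral dx q'(x)^2 against the closed form
# I = 1/sigma^2 for a Gaussian probability law p(x), with q(x) = sqrt(p(x)).
sigma = 1.5
x = np.linspace(-12.0, 12.0, 20001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))
q = np.sqrt(p)                 # real amplitude, p = q^2
dq = np.gradient(q, x)         # q' = dq/dx
I = 4.0 * np.sum(dq**2) * dx   # I = 4 * integral of q'^2

print(I, 1.0 / sigma**2)       # both approximately 0.4444 for sigma = 1.5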
2014
The unknown amplitude law q(x) defining an observed effect may be found using the principle of Extreme Physical Information. EPI is derived as follows. The observations follow an information flow J → I, with J the information intrinsic to the source and I the Fisher information level in its data, obeying (i) I = 4∫dx q′². Here q′ = dq/dx and p(x) = q(x)² is the probability. It was previously shown, using L. Hardy's five axioms defining physics, that I = max. Therefore its variation vanishes, (ii) δI = 0. Note that I is generic, obeying (i) for all source effects, whereas J is specific to the particular effect. Hence, rather than having form (i), J obeys (iii) J = ∫dx j[q(x), s(x)], with j some function of its arguments and s(x) a known source, such as mass, biological fitness, etc. Information I decreases under any irreversible operation such as measurement, so that I ≤ J or, equivalently, I = kJ where 0 ≤ k ≤ 1. Then the variation δI = k δJ, so that property (ii) gives (iv) δJ = 0 as well. Combining (ii) and (iv), δ(I − J) = 0, or I − J = L = extremum. What kind of extremum? Eqs. (i) and (iii) give the integrand (v) L = 4q′² − j[q(x), s(x)]. Differentiating (v), ∂²L/∂q′² = +8 > 0. Then by the Legendre condition the extremum is a minimum. The unknown source effect obeys (vi) I − J = minimum, which is EPI.
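A LaTeX transcription of steps (i)-(vi), with the Euler-Lagrange equation that the minimum implies appended (the final line is standard variational calculus, not stated in the abstract):

\begin{align}
&\text{(i)}\;\; I = 4\int dx\, q'^2, \qquad
\text{(ii)}\;\; \delta I = 0, \qquad
\text{(iii)}\;\; J = \int dx\, j[q(x), s(x)], \qquad
\text{(iv)}\;\; \delta J = 0, \\
&\text{(v)}\;\; \mathcal{L} = 4 q'^2 - j[q(x), s(x)], \quad
\frac{\partial^2 \mathcal{L}}{\partial q'^2} = 8 > 0, \qquad
\text{(vi)}\;\; I - J = \text{minimum}, \\
&\frac{d}{dx}\frac{\partial \mathcal{L}}{\partial q'} - \frac{\partial \mathcal{L}}{\partial q} = 0
\;\;\Rightarrow\;\; 8\, q'' + \frac{\partial j}{\partial q} = 0 .
\end{align}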
Physical Review A, 1990
Consider an isolated statistical system specified by a coordinate x and its probability density p(x). A functional of p(x) called "Fisher information" can be used to measure the degree of disorder of the system due to the spread in p(x). Fisher information may be minimized, subject to a physical constraint, to attain a temporal equilibrium solution p(x). When the constraint is linear in the mean kinetic energy of the system, the equilibrium solution p(x) often obeys the correct differential equation for the system. In this way, the Schrödinger (energy) wave equation, Klein-Gordon equation, Helmholtz wave equation, diffusion equation, Boltzmann law, and Maxwell-Boltzmann law may be derived from one classical principle of disorder. The convergence rate for Fisher information is about that for alternative use of maximum entropy (in problems where both have the same equilibrium solution). This suggests that Fisher information defines an arrow of time. The arrow points in the direction of decreasing accuracy for the determination of the mean, or ideal, value of a parameter.
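As one worked instance of the constraint described above (a sketch under the stated assumptions; the multiplier value is chosen so the familiar constants appear): minimizing I = 4∫dx q′² subject to a constraint linear in the mean kinetic energy, i.e. fixing ∫dx q²(E − V(x)), gives

\[
\delta\!\left[\,4\int dx\, q'^2 \;-\; \lambda \int dx\, q^2\,\bigl(E - V(x)\bigr)\right] = 0
\;\;\Rightarrow\;\; 8\, q'' + 2\lambda\,(E - V)\, q = 0 ,
\]

and the choice λ = 8m/ħ² turns this into −(ħ²/2m) q″ + V(x) q = E q, the time-independent Schrödinger equation.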
Physical Review E, 2013
Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N=max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N=max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I=I(max). This is important because many physical laws have been derived, assuming as a working hypothesis that I=I(max). These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations were of the D...
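For context, the standard estimation-theoretic definitions behind these statements (textbook material, not specific to this paper): the Fisher information about a parameter a carried by data y, and the Cramér-Rao bound tying it to measurement accuracy, are

\[
I(a) = \int dy\; p(y|a)\left(\frac{\partial \ln p(y|a)}{\partial a}\right)^{2},
\qquad e^{2} \;\geq\; \frac{1}{I(a)} ,
\]

where e² is the mean-squared error of any unbiased estimate of a. Larger I means finer distinguishability of neighboring states a, which suggests why extending N = max to fluctuating systems leads to I = max.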
Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 2005
We review with a tutorial scope the information theory foundations of quantum statistical physics. Only a small proportion of the variables that characterize a system at the microscopic scale can be controlled, for both practical and theoretical reasons, and a probabilistic description involving the observers is required. The criterion of maximum von Neumann entropy is then used for making reasonable inferences. It means that no spurious information is introduced besides the known data. Its outcomes can be given a direct justification based on the principle of indifference of Laplace. We introduce the concept of relevant entropy associated with some set of relevant variables; it characterizes the information that is missing at the microscopic level when only these variables are known. For equilibrium problems, the relevant variables are the conserved ones, and the Second Law is recovered as a second step of the inference process. For non-equilibrium problems, the increase of the relevant entropy expresses an irretrievable loss of information from the relevant variables towards the irrelevant ones. Two examples illustrate the flexibility of the choice of relevant variables and the multiplicity of the associated entropies: the thermodynamic entropy (satisfying the Clausius-Duhem inequality) and the Boltzmann entropy (satisfying the H-theorem). The identification of entropy with missing information is also supported by the paradox of Maxwell's demon. Spin-echo experiments show that irreversibility itself is not an absolute concept: use of hidden information may overcome the arrow of time.
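The inference criterion described here has a standard closed form (a textbook result, added for illustration): maximizing the von Neumann entropy S(ρ) = −Tr ρ ln ρ subject to the known data Tr(ρ A_i) = a_i yields

\[
\rho = \frac{1}{Z}\exp\!\Bigl(-\sum_i \lambda_i A_i\Bigr),
\qquad Z = \operatorname{Tr}\exp\!\Bigl(-\sum_i \lambda_i A_i\Bigr),
\]

with the Lagrange multipliers λ_i fixed by the constraints; for a single constraint on the energy, A = H, this is the canonical Gibbs state.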
Physical Review, 1957
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether …
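The "usual computational rules" follow in a few lines (a standard maximum-entropy calculation, added for illustration): maximizing H = −Σ_i p_i ln p_i subject to Σ_i p_i = 1 and Σ_i p_i E_i = ⟨E⟩ gives

\[
p_i = \frac{e^{-\beta E_i}}{Z(\beta)},
\qquad Z(\beta) = \sum_i e^{-\beta E_i},
\qquad \langle E \rangle = -\frac{\partial \ln Z}{\partial \beta},
\]

so the partition function Z appears as the immediate consequence the abstract describes.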
Physica A: Statistical Mechanics and its Applications, 1992
Maxwell's equations of classical electrodynamics may be derived on the following statistical basis. Consider a gedanken experiment whereby the mean space-time coordinate for photons in an electromagnetic field is to be determined by observation of one photon's space-time coordinate. An efficient (i.e. optimum) estimate obeys a condition of minimum Fisher information, or minimum precision, according to the second law of thermodynamics. The Fisher information I is a simple functional of the probability law governing space-time coordinates of the "particles" of the field. This probability law is modeled as the source-free Poynting energy flow density, i.e., the ordinary local intensity in the optical sense, or, the square of the four-vector potential. When the Fisher information is extremized subject to an additive constraint term in the total interaction energy, Maxwell's equations result.
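The structure of such an extremization can be sketched in generic variational terms (a hedged sketch, not the paper's exact functional): treating the four-potential components A_ν as the amplitudes and coupling them to the four-current J^ν through an additive interaction-energy constraint,

\[
\delta \int d^4x \left[\, \partial_\mu A_\nu\, \partial^\mu A^\nu \;-\; \lambda\, A_\nu J^\nu \right] = 0
\;\;\Rightarrow\;\; \Box A^\nu \propto J^\nu ,
\]

which, in the Lorenz gauge ∂_μA^μ = 0 and with the multiplier λ fixed to give μ₀ in SI units, is Maxwell's equations in potential form.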
Foundations of Science, 2011
During the refereeing procedure of Anthropomorphic Quantum Darwinism by Thomas Durt, it became apparent in the dialogue between him and me that the definition of information in Physics is something about which not all authors agreed. This text aims at describing the concepts associated to information that are accepted as the standard in the Physics world community.
Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Chamonix, France, 2010
At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being a description of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. What is more, little clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper and that Cox's approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science.
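A minimal sketch of the quantification step described here, assuming the simplest partially ordered set, the Boolean lattice of subsets (illustrative only, not Knuth's general derivation): a valuation v consistent with the lattice structure obeys the sum rule v(a ∨ b) + v(a ∧ b) = v(a) + v(b), which the Python below checks exhaustively for a hypothetical additive valuation.

from itertools import combinations

# Boolean lattice of subsets of {0, 1, 2}: join = union, meet = intersection.
# The valuation v (total weight of a subset) satisfies the sum rule
# v(a | b) + v(a & b) == v(a) + v(b) for every pair of lattice elements.
universe = {0, 1, 2}
weight = {0: 0.5, 1: 1.25, 2: 2.0}   # hypothetical weights on the atoms

def v(s):
    return sum(weight[e] for e in s)

subsets = [frozenset(c) for r in range(len(universe) + 1)
           for c in combinations(universe, r)]

for a in subsets:
    for b in subsets:
        assert abs(v(a | b) + v(a & b) - (v(a) + v(b))) < 1e-12
print('sum rule holds on all', len(subsets) ** 2, 'pairs')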