Statistical mechanics arose out of the development of classical thermodynamics, a field for which it was successful in explaining macroscopic physical properties—such as temperature, pressure, and heat capacity—in terms of microscopic parameters that fluctuate about average values and are characterized by probability distributions.[citation needed]

In 1738, the Swiss physicist and mathematician Daniel Bernoulli published Hydrodynamica, which laid the basis for the kinetic theory of gases. In this work, Bernoulli posited the argument, still used to this day, that gases consist of great numbers of molecules moving in all directions, that their impact on a surface causes the gas pressure that we feel, and that what we experience as heat is simply the kinetic energy of their motion.[9]

The founding of the field of statistical mechanics is generally credited to three physicists: James Clerk Maxwell, Ludwig Boltzmann, and J. Willard Gibbs. In 1859, after reading a paper on the diffusion of molecules by Rudolf Clausius, the Scottish physicist Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range (its standard form is given below).[13] Boltzmann's original papers on the statistical interpretation of thermodynamics, the H-theorem, transport theory, thermal equilibrium, the equation of state of gases, and similar subjects occupy about 2,000 pages in the proceedings of the Vienna Academy and other societies.

There is, however, a disconnect between these laws and everyday life experiences, as we do not find it necessary (nor even theoretically possible) to know exactly at a microscopic level the simultaneous positions and velocities of each molecule while carrying out processes at the human scale (for example, when performing a chemical reaction).

In quantum statistical mechanics, the ensemble is a probability distribution over pure states and can be compactly summarized as a density matrix.[9][20][21]
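As a minimal sketch of the density matrix just mentioned: for a mixed state that assigns probability \(p_i\) to the pure state \(|\psi_i\rangle\), the standard construction is

\[
\rho = \sum_i p_i \, |\psi_i\rangle\langle\psi_i| , \qquad \langle A \rangle = \operatorname{Tr}(\rho A) ,
\]

where the second relation gives the ensemble average of an observable \(A\). (The notation is the standard textbook one, supplied here for illustration.)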
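For reference, the Maxwell distribution mentioned earlier is usually quoted as a probability density over molecular speeds. In its standard modern form (notation supplied here for illustration, with \(m\) the molecular mass, \(k_{\mathrm{B}}\) the Boltzmann constant, and \(T\) the temperature),

\[
f(v) = 4\pi \left( \frac{m}{2\pi k_{\mathrm{B}} T} \right)^{3/2} v^{2} \exp\!\left( -\frac{m v^{2}}{2 k_{\mathrm{B}} T} \right) ,
\]

so that \(f(v)\,dv\) is the fraction of molecules with speed between \(v\) and \(v + dv\).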
Recent studies show that the theory of statistical mechanics can be built without the equal a priori probability postulate.[22] The Gibbs theorem on the equivalence of ensembles[23] was developed into the theory of the concentration of measure phenomenon,[24] which has applications in many areas of science, from functional analysis to methods of artificial intelligence and big data technology.

In order to make headway in modelling irreversible processes, it is necessary to consider additional factors besides probability and reversible mechanics. Non-equilibrium mechanics is therefore an active area of theoretical research, as the range of validity of these additional assumptions continues to be explored.

The Boltzmann transport equation and related approaches are important tools in non-equilibrium statistical mechanics due to their extreme simplicity (the equation is sketched at the end of this section). These approximations work well in systems where the "interesting" information is immediately (after just one collision) scrambled up into subtle correlations, which essentially restricts them to rarefied gases.

A few theoretical tools are used to make this connection; one advanced approach uses a combination of stochastic methods and linear response theory (an example is sketched at the end of this section).

Ensembles are also used in other areas of science. Statistical physics explains and quantitatively describes superconductivity, superfluidity, turbulence, collective phenomena in solids and plasmas, and the structural features of liquids.[citation needed] Analytical and computational techniques derived from the statistical physics of disordered systems can be extended to large-scale problems, including machine learning, e.g., to analyze the weight space of deep neural networks.
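For reference, the Boltzmann transport equation mentioned above reads, in its standard form (the notation here is supplied for illustration),

\[
\frac{\partial f}{\partial t} + \mathbf{v} \cdot \nabla_{\mathbf{r}} f + \frac{\mathbf{F}}{m} \cdot \nabla_{\mathbf{v}} f = \left( \frac{\partial f}{\partial t} \right)_{\mathrm{coll}} ,
\]

where \(f(\mathbf{r}, \mathbf{v}, t)\) is the single-particle distribution function, \(\mathbf{F}\) is an external force, \(m\) is the particle mass, and the right-hand side is the collision term into which the "interesting" information is scrambled.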
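As one concrete instance of the linear response machinery mentioned above (a standard textbook result, not specific to this article's sources), the Green–Kubo relation expresses the self-diffusion coefficient of a fluid as a time integral of an equilibrium velocity autocorrelation function:

\[
D = \frac{1}{3} \int_0^{\infty} \langle \mathbf{v}(t) \cdot \mathbf{v}(0) \rangle_{\mathrm{eq}} \, dt ,
\]

where the average is taken over the equilibrium ensemble; a transport coefficient is thereby obtained from equilibrium fluctuations alone.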