Coarse-grained modeling

Coarse-grained models are usually dedicated to computational modeling of specific molecules: proteins,[1][2] nucleic acids,[3][4] lipid membranes,[2][5] carbohydrates[6] or water. In these models, molecules are represented not by individual atoms but by "pseudo-atoms" approximating groups of atoms, such as a whole amino acid residue. By decreasing the number of degrees of freedom in this way, much longer simulation times can be studied at the expense of molecular detail. Coarse-grained models have found practical applications in molecular dynamics simulations.[8][10] Coarse-grained modeling originates from work by Michael Levitt and Ariel Warshel in the 1970s.[1] Atomistic resolution models alone are presently not efficient enough to handle large system sizes and simulation timescales; coarse-grained models are therefore often used as components of multiscale modeling protocols.
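
As a minimal sketch of the idea (not any particular published coarse-grained force field), the Python snippet below maps an atomistic structure onto one pseudo-atom per group of atoms by placing each bead at its group's centre of mass; the function name, the array layout and the three-atoms-per-bead grouping in the demo are assumptions made purely for illustration.

```python
import numpy as np

def coarse_grain(positions, masses, groups):
    """Map atomistic coordinates onto one pseudo-atom (bead) per group.

    positions : (n_atoms, 3) array of atomic coordinates
    masses    : (n_atoms,) array of atomic masses
    groups    : list of index arrays, one per bead (e.g. one per residue)

    Returns an (n_beads, 3) array of bead coordinates, each placed at
    the mass-weighted centre (centre of mass) of its atom group.
    """
    beads = np.empty((len(groups), 3))
    for i, idx in enumerate(groups):
        m = masses[idx]
        beads[i] = (m[:, None] * positions[idx]).sum(axis=0) / m.sum()
    return beads

# Toy demonstration: 9 atoms grouped into 3 beads of 3 atoms each,
# reducing 27 Cartesian degrees of freedom to 9.
rng = np.random.default_rng(0)
pos = rng.random((9, 3))
mas = np.ones(9)
grp = [np.arange(0, 3), np.arange(3, 6), np.arange(6, 9)]
print(coarse_grain(pos, mas, grp))   # three bead positions
```

The reduction from 27 to 9 degrees of freedom in the toy example is exactly the trade-off described above: fewer coordinates to propagate per time step, at the cost of intra-group detail.
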
Coarse graining and temperature

A temperature $T$ cannot be attributed to an arbitrarily microscopic particle, since such a particle does not radiate thermally like a macroscopic body or "black body".

The Liouville equation states that a phase-space volume element $\Delta x\,\Delta p$ (here in one spatial dimension) remains constant in the course of time, no matter where the point it contains moves. In order to relate this view to macroscopic physics, one surrounds each point $(x, p)$, e.g. with a sphere of some fixed volume - a procedure called coarse graining, which lumps together points or states of similar behaviour. A large number of such systems, i.e. the one under consideration together with many copies, is called an ensemble. Each replica system appears with the same probability, and temperature does not enter. In the case of equilibrium or steady motion, the equation of continuity implies that the probability density $\rho$ is independent of time. The entropy obtained in this way, $S = -k\,\mathrm{Tr}(\rho\ln\rho)$, is described as fine grained or von Neumann entropy.

The microcanonical ensemble is again a large number of noninteracting copies of the given system, each copy having the same fixed energy. Now consider the interaction of a given system with another one - or, in ensemble terminology, the given system and the large number of replicas all immersed in a big one called a heat bath, characterised by a probability density $\rho$. The interaction of the ensemble elements via the heat bath leads to the temperature $T$: a state of energy $E_i$ acquires the Boltzmann weight $e^{-E_i/kT}$, and normalisation then implies $\rho_i = e^{-E_i/kT}/\sum_j e^{-E_j/kT}$. Then, in terms of ensemble averages, the entropy is $S = -k\sum_i \rho_i \ln \rho_i$, or, by comparison with the second law of thermodynamics, $dS = dQ/T$.

Frequently one considers a different situation, i.e. two systems A and B with a small hole in the partition between them. Suppose B is originally empty, but A contains an explosive device which fills A instantaneously with photons. Hence originally both are in pure quantum states and have zero fine grained entropies. Since A is filled with photons, these obey a Planck distribution law, and hence the coarse grained thermal entropy of A is nonzero (recall: many configurations of the photons in A are compatible with the same macrostate), although the fine grained quantum mechanical entropy of A is still zero (A occupies a single energy state), as does that of B.

Now allow photons to leak slowly (i.e. with no disturbance of the equilibrium) from A to B. With fewer photons in A, its coarse grained entropy diminishes, but that of B increases. This entanglement of A and B implies that they are now quantum mechanically in mixed states, and so their fine grained entropies are no longer zero. Finally, when all photons are in B, the coarse grained entropy of A as well as its fine grained entropy vanish, and A is again in a pure state, but with new energy. On the other hand, B now has an increased thermal entropy; but since the entanglement is over, it is quantum mechanically again in a pure state, its ground state, and that has zero fine grained von Neumann entropy. Its coarse grained entropy, however, rose from zero to its final nonzero value. Roughly halfway through the procedure the entanglement entropy of B reaches a maximum and then decreases to zero at the end.

The classical coarse grained thermal entropy of the second law of thermodynamics is thus not the same as the (mostly smaller) quantum mechanical fine grained entropy. As may be deduced from the foregoing arguments, the difference between the two is roughly zero before the entanglement entropy (which is the same for A and B) attains its maximum.
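
The rise and fall of the entanglement entropy can be illustrated with a toy calculation (a deliberately simplified sketch, not a simulation of the actual photon dynamics). Below, the joint pure state after a fraction p of the leakage is modelled as a superposition over the number k of the N photons that have reached B, with binomial amplitudes chosen purely for illustration; the fine grained entropy of B is then the von Neumann entropy of its reduced density matrix, whose eigenvalues are the probabilities |c_k|^2.

```python
import numpy as np
from math import comb

N = 20  # total number of photons initially in A (toy value)

for p in [0.0, 0.25, 0.5, 0.75, 1.0]:
    # Joint pure state: sum_k c_k |N-k>_A |k>_B, with |c_k|^2 binomial.
    probs = np.array([comb(N, k) * p**k * (1 - p)**(N - k)
                      for k in range(N + 1)])
    probs = probs[probs > 0]          # drop zero terms before the log
    # Von Neumann entropy of B's reduced density matrix (in units of k).
    # For a bipartite pure state, A and B have equal entanglement entropy.
    S = -(probs * np.log(probs)).sum()
    print(f"fraction leaked p={p:.2f}  entanglement entropy S={S:.3f}")
```

The printout shows S = 0 at p = 0 and p = 1 (A and B each in a pure state) and a maximum near p = 0.5, matching the qualitative course described above.
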
See also

complex systems
molecular modeling
amino acid residue
molecular dynamics
Kinesin
Michael Levitt
Ariel Warshel
multiscale modeling
black body
Liouville equation
information
Brownian motion
LAMMPS