Article

Entropy of Simulated Liquids Using Multiscale Cell Correlation

by Hafiz Saqib Ali 1,2, Jonathan Higham 1,2,† and Richard H. Henchman 1,2,*

1 Manchester Institute of Biotechnology, The University of Manchester, 131 Princess Street, Manchester M1 7DN, UK
2 School of Chemistry, The University of Manchester, Oxford Road, Manchester M13 9PL, UK
* Author to whom correspondence should be addressed.
† Current address: Human Genetics Unit, Institute of Genetics & Molecular Medicine, The University of Edinburgh, Western General Hospital, Crewe Road South, Edinburgh EH4 2XU, UK.
Entropy 2019, 21(8), 750; https://doi.org/10.3390/e21080750
Submission received: 29 June 2019 / Revised: 22 July 2019 / Accepted: 28 July 2019 / Published: 31 July 2019

Abstract

Accurately calculating the entropy of liquids is an important goal, given that many processes take place in the liquid phase. Of almost equal importance is understanding the values obtained. However, there are few methods that can calculate the entropy of such systems, and fewer still to make sense of the values obtained. We present our multiscale cell correlation (MCC) method to calculate the entropy of liquids from molecular dynamics simulations. The method uses forces and torques at the molecule and united-atom levels and probability distributions of molecular coordinations and conformations. The main differences with previous work are the consistent treatment of the mean-field cell approximation to the appropriate degrees of freedom, the separation of the force and torque covariance matrices, and the inclusion of conformation correlation for molecules with multiple dihedrals. MCC is applied to a broader set of 56 important industrial liquids modeled using the Generalized AMBER Force Field (GAFF) and Optimized Potentials for Liquid Simulations (OPLS) force fields with 1.14*CM1A charges. Unsigned errors versus experimental entropies are 8.7 J K⁻¹ mol⁻¹ for GAFF and 9.8 J K⁻¹ mol⁻¹ for OPLS. This is significantly better than the 2-Phase Thermodynamics method for the subset of molecules in common, which is the only other method that has been applied to such systems. MCC makes clear why the entropy has the value it does by providing a decomposition in terms of translational and rotational vibrational entropy and topographical entropy at the molecular and united-atom levels.


1. Introduction

Molecular liquids are present in numerous systems in chemistry and biology. However, methods to calculate their entropy are scarce or limited in scope. Entropy quantifies the probability distribution of quantum states of a system and, together with energy, determines a system’s stability. The most common route used to determine entropy is indirect, namely as a difference with respect to a reference state, typically the ideal gas or a non-interacting set of atoms. The entropy difference may be extracted from integrated heat capacity changes or from the Gibbs energy difference, either as its temperature derivative or as a difference with enthalpy [1]. While there is a range of methods to compute entropy [2,3,4,5,6,7,8,9,10], those that use a single molecular dynamics or Monte Carlo simulation are advantageous because of the ease of using standard simulation methods and because such approaches directly yield and explain entropy and structure in terms of the full probability distribution of the system of interest. However, because the ensemble of molecular configurations generated by standard simulation methods is only a tiny fraction of the full ensemble corresponding to a system’s entropy, special techniques are required to extrapolate to the full probability distribution and entropy.
The probability distributions to evaluate entropy are typically over the coordinates of the system, which may be Cartesian coordinates, bonds-angles-dihedrals, or interatomic distances. Histogram-based methods, because of their arbitrary bin-widths, can only give the entropy difference relative to a reference, which is typically the uniform distribution. Even then, the entropy difference may be unrealistic for strongly interacting systems such as those with covalent bonds because of the omission of quantum effects which necessarily keep the entropy non-negative. For this reason, histogram methods are often restricted to softer degrees of freedom such as dihedrals or atomic distances. The simplest approach ignores coordinate correlations by considering each coordinate separately, for example, in dihedral angles [11]. Higher-order correlations can be included such as the radial distribution function [12,13,14] or a mutual-information expansion [15,16] but at greater computational expense and complexity, even for second-order, although some correlations are small and can be excluded [17,18,19]. Extensions to higher orders are difficult and do not necessarily lead to more accuracy [15,20]. Mutual information in terms of discrete rotamers has been found to converge much faster, enabling up to eighth order [21]. An alternative strategy for high-dimensional data sets is the k-Nearest Neighbours method [16,22,23,24] which more adaptively estimates density from the distances between configurations but at the price of having many distances to compute and still requiring a lot of data to converge.
Significant simplification of the theory, greater speed of convergence and a route to the direct calculation of entropy is provided by assuming a multivariate Gaussian probability distribution [25]. Entropy is directly computed from the quantum states of the set of harmonic-oscillator eigenvectors [26,27]. The main limitation of the method is the suitability of the Gaussian distribution, given that typical potential energy surfaces for flexible molecules [28] or liquids [27,29] have multiple minima, compounded by the difficulty of how to specify the minima. A hybrid solution to this problem is to replace the diagonal elements of the coordinate covariance matrix with the entropy of the probability distributions [30,31]. Another solution is to incorporate multiple Gaussians [32]. An approach particularly relevant to the case of liquids is the 2-Phase Thermodynamics (2PT) method, which calculates entropy from the spectrum of vibrational frequencies derived from the velocity auto-correlation function and the gas-phase fluidicity [33]. Another viable method for liquid-phase entropy is the cell approximation, which maps regions of the potential energy surface into single, representative energy wells, whose entropy is determined from the force [34] plus an entropy term for the probability distribution of the energy wells [35]. This is the method we have been working to generalise, progressing from liquid argon [34] to liquid water with its rotational vibration and orientational degrees of freedom [35,36,37], organic liquids with an internal one-dimensional dihedral entropy [38], single molecules with internal entropy based on force correlation [39], and molecular liquids in a multiscale framework from atom to united atom to molecule to system [40]. This development has been supported by extensive parallel studies on the entropy of aqueous solutions [41,42,43,44,45,46,47]. With the main ideas now in place to make the method general, we name it Multiscale Cell Correlation (MCC) to encapsulate its main features.
Here we extend MCC to calculate the entropy of 56 important industrial liquids. These represent a class of system for which no method other than 2PT has been capable of calculating the entropy; 2PT has been tested on a smaller subset of 14 liquids [48], argon [33], water [49], carbon dioxide [50], and methanol and hexane including torsional fluidicity [51]. The first improvement here in MCC is a more appropriate application of the mean-field cell approximation to the weakly correlated non-bonded and dihedral degrees of freedom and not to the correlated bonded and angular degrees of freedom as had been done in previous work [39,40]. Strong correlations for the bonded atoms invalidate the cell approximation and can be accounted for in the force covariance matrices. Related to this, force and torque covariances are evaluated separately because of their weak correlation [40]. The second key improvement is a new way to account for correlation between dihedrals by using a covariance matrix of conformation correlation, a method that scales with the square of the number of dihedrals. The 56 liquids are tested using two force fields: OPLS (Optimized Potentials for Liquid Simulations) with 1.14*CM1A charges [52] and GAFF [53] (Generalized AMBER Force Field), for both of which parameters can be generated in an automated fashion for a wide range of molecules. A decomposition of the entropy into six terms gives an insightful and intuitive explanation of why molecules have the entropy they do. Compared to our earlier study, in which a comparison with 2PT was inconclusive because there were few liquids in common, MCC is found to be significantly closer to experiment than 2PT, which in most cases underestimates experiment. An analysis of entropy components suggests that the internal entropy of 2PT is responsible for this underestimation, even when torsional fluidicity is included [51]. The findings show that MCC is well placed to scale to complex multi-component systems with multiple length scales.

2. Theory

2.1. Entropy Decomposition

The entropy of molecular liquids is well captured at two different length scales [40]: the molecule (M) level and the united-atom (UA) level. A united atom is defined here as a non-hydrogen atom together with any bonded hydrogen atoms and is taken as a rigid body with both translational and rotational degrees of freedom, rather than only translation as for a point particle, unless it has no hydrogens. Such an approach captures the softer collective dihedral motion of hydrogens while ignoring their individual stretching and bending motions, which have negligible entropy owing to the low mass of hydrogen and its higher bond and angle vibrational frequencies. At the other extreme, the entropy of the whole system’s three translational and three rotational degrees of freedom is negligible on a per-molecule basis. Coordinate systems at the molecule and united-atom levels are defined as before [40]. For a molecule this is its three principal axes with the origin at the centre of mass. For a united atom the axes and centre of mass depend on the number of bonded united and hydrogen atoms. All molecules and united atoms have three translational degrees of freedom; non-linear molecules and united atoms also have three rotational degrees of freedom. Linear molecules in terms of their united atoms or linear united atoms in terms of their hydrogens have two rotational degrees of freedom. United atoms with no hydrogens have no rotational degrees of freedom.
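To make the united-atom grouping concrete, the following is a minimal sketch (not the authors' in-house code) that partitions a molecule into united atoms from an element list and a bond list; the function name and data layout are illustrative assumptions.

```python
from collections import defaultdict

def united_atoms(elements, bonds):
    """elements: list like ['C','H','H','H','O','H']; bonds: list of (i, j) atom-index pairs.
    Returns one list of atom indices per united atom (heavy atom plus its bonded hydrogens)."""
    neighbours = defaultdict(list)
    for i, j in bonds:
        neighbours[i].append(j)
        neighbours[j].append(i)
    groups = []
    for i, el in enumerate(elements):
        if el == "H":
            continue  # hydrogens are absorbed into their bonded heavy atom
        hydrogens = [j for j in neighbours[i] if elements[j] == "H"]
        groups.append([i] + hydrogens)
    return groups

# methanol (C, 3xH, O, H) gives two united atoms: CH3 and OH
print(united_atoms(["C", "H", "H", "H", "O", "H"],
                   [(0, 1), (0, 2), (0, 3), (0, 4), (4, 5)]))
```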
In the cell approximation the potential energy surface is partitioned into energy wells, and in the multiscale approximation this partitioning is done at the molecule and united-atom levels. This brings about two kinds of entropy term: vibrational relating to the average size of the energy wells, termed a cell, and topographical relating to the probability of the energy wells. The vibrational term at each level is further partitioned according to the translational (transvib) and rotational (rovib) degrees of freedom. The translational component of the topographical entropy at the molecular level is zero for a pure liquid because exchanging identical molecules leads to no change. The rotational topographical entropy (topo) at the molecule level is termed the orientational entropy. At the united-atom level the translational topographical entropy is the conformational entropy, while the rotational component, corresponding to hydrogen-bond arrangements, is negligible for the liquids studied here. The total entropy per molecule for a liquid is therefore taken as the sum of six terms
$$S_\mathrm{total} = S_\mathrm{M}^\mathrm{transvib} + S_\mathrm{M}^\mathrm{rovib} + S_\mathrm{M}^\mathrm{topo} + S_\mathrm{UA}^\mathrm{transvib} + S_\mathrm{UA}^\mathrm{rovib} + S_\mathrm{UA}^\mathrm{topo} \qquad (1)$$

2.2. Molecular Vibrational Entropy

All four vibrational entropy terms are calculated in the harmonic approximation using the equation for a collection of $N_\mathrm{vib}$ quantum harmonic oscillators
$$S_\mathrm{vib} = k_\mathrm{B} \sum_{i=1}^{N_\mathrm{vib}} \left[ \frac{h\nu_i/k_\mathrm{B}T}{e^{h\nu_i/k_\mathrm{B}T} - 1} - \ln\left(1 - e^{-h\nu_i/k_\mathrm{B}T}\right) \right] \qquad (2)$$
where $k_\mathrm{B}$ is Boltzmann’s constant, $h$ is Planck’s constant, $T$ is temperature and $\nu_i$ are the vibrational frequencies. In contrast to previous work [40], the translational and rotational vibrational entropies are evaluated separately, justified by the absence of correlations between the forces and torques that are used to evaluate them. For $S_\mathrm{M}^\mathrm{transvib}$, $N_\mathrm{vib} = 3$ and the $\nu_i$ are calculated using [39,40]
$$\nu_i = \frac{1}{2\pi}\sqrt{\frac{\lambda_i}{k_\mathrm{B}T}} \qquad (3)$$
where $\lambda_i$ are the eigenvalues of the $3 \times 3$ mass-weighted force covariance matrix of the molecule with elements $\langle \tilde{F}_i \tilde{F}_j \rangle$, with $i$ and $j$ ranging over the three axes $x$, $y$, $z$ and averaging over all molecules in all simulation frames. Mean-field, mass-weighted forces are defined as $\tilde{F}_i = F_i/(2\sqrt{m})$, where $m$ is the molecule’s mass and $F_i$ is the component of the net force on all the atoms of the molecule rotated into the molecule’s coordinate frame, the factor of two halving the force. In practice, this matrix is essentially diagonal because forces along different axes are negligibly correlated. The halving is done in the mean-field cell approximation [34,35], whereby every pairwise energy term, and therefore its negative coordinate derivative, the force, is partitioned equally between the atoms involved. The mean-field cell approximation is justified in liquids because average molecular energies and forces in many-body systems are weakly correlated with the position of any other neighbouring molecule. Only over the short duration of a repulsive collision is the correlation significant. To calculate $S_\mathrm{M}^\mathrm{rovib}$ with Equation (2), $N_\mathrm{vib} = 3$ unless the molecule is linear with respect to its united atoms, in which case $N_\mathrm{vib} = 2$. The vibrational frequencies $\nu_i$ are calculated using Equation (3) with eigenvalues from the $N_\mathrm{vib} \times N_\mathrm{vib}$ moment-of-inertia-weighted torque covariance matrix of the molecule, whose elements are $\langle \tilde{\tau}_i \tilde{\tau}_j \rangle$, where $\tilde{\tau}_i = \tau_i/(2\sqrt{I_i})$ for each axis $i = x, y, z$ and $I_i$ is the respective moment of inertia, with the torque halving being done as for the forces.
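As an illustration of Equations (2) and (3), the sketch below (a minimal reconstruction, not the authors' in-house C++ code) converts a set of mean-field, mass-weighted molecular forces into a covariance matrix, vibrational frequencies and a molar translational vibrational entropy; the function name and array layout are assumptions.

```python
import numpy as np

K_B = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J s
N_A = 6.02214076e23   # Avogadro constant, 1/mol

def transvib_entropy(f_tilde, temperature):
    """f_tilde: (n_samples, 3) mean-field, mass-weighted forces (SI units, N/sqrt(kg))
    in the molecular frame, pooled over all molecules and frames. Returns S in J/(K mol)."""
    cov = f_tilde.T @ f_tilde / len(f_tilde)                   # 3x3 force covariance matrix
    lam = np.linalg.eigvalsh(cov)                              # its eigenvalues
    nu = np.sqrt(lam / (K_B * temperature)) / (2.0 * np.pi)    # Equation (3)
    x = H * nu / (K_B * temperature)
    s = K_B * np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))  # Equation (2)
    return s * N_A
```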

2.3. United-Atom Vibrational Entropy

The procedure at the united-atom level to evaluate $S_\mathrm{UA}^\mathrm{transvib}$ and $S_\mathrm{UA}^\mathrm{rovib}$ in Equation (1) is similar to that at the molecule level but with some differences. United atoms are used in place of molecules to evaluate the forces, torques, masses and moments of inertia. $N_\mathrm{vib}$ in Equation (2) for united-atom translation equals $3N - 6$, where $N$ is the number of united atoms and the six vibrations removed correspond to the six largest eigenvalues, which are already accounted for as molecular translation and rotation. $N_\mathrm{vib}$ in Equation (2) for united-atom rotation depends on the number of non-linear, linear and point united atoms, as well as the linearity of the whole molecule. Non-linear and linear united atoms contribute 3 and 2 degrees of freedom, respectively, and the largest six or five eigenvalues are removed if the molecule is non-linear or linear. A notable difference compared to the molecule level is that the mean-field cell approximation is not made for bonded atoms or bonded 1–3 interactions corresponding to angles. The forces of such atoms are strongly correlated, a correlation that is accounted for in the covariance matrix. However, the mean-field approximation is still made for united-atom rotation and dihedral vibration, whose correlations with neighbours are weak relative to the overall torque or force or which largely average out to zero because of averaging in different reference frames. Consequently, forces in the united-atom matrix are not halved but united-atom torques are halved. To implement the cell approximation for dihedrals, the $N_\mathrm{dih}$ lowest eigenvalues of the united-atom force covariance matrix are halved twice (force squared), where $N_\mathrm{dih}$ is the number of united-atom dihedrals, because these eigenvalues correspond to the soft conformational eigenvectors.
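A hedged sketch of this eigenvalue bookkeeping for the united-atom translational term follows (illustrative only, not the authors' code): the six largest eigenvalues are discarded and the $N_\mathrm{dih}$ softest ones are quartered, which halves the corresponding forces twice before the frequencies are formed as above.

```python
import numpy as np

def ua_transvib_eigenvalues(cov_ua, n_dih):
    """cov_ua: 3N x 3N mass-weighted united-atom force covariance matrix (no halving applied);
    returns the 3N-6 eigenvalues to feed into Equations (2)-(3)."""
    lam = np.sort(np.linalg.eigvalsh(cov_ua))  # ascending order
    lam = lam[:-6]                             # drop the 6 largest: molecular translation + rotation
    lam[:n_dih] = lam[:n_dih] / 4.0            # mean-field halving of the soft dihedral force components
    return lam
```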

2.4. Molecular Topographical Entropy

The molecular topographical entropy $S_\mathrm{M}^\mathrm{topo}$ in Equation (1) only has a rotational contribution for a pure liquid, referred to as the orientational entropy. Based on the idea that neighbouring molecules discretize a molecule’s rotational motion, $S_\mathrm{M}^\mathrm{topo}$ is estimated using an average of the number of orientations weighted by the probability $p(N_\mathrm{c})$ of molecular coordination number $N_\mathrm{c}$ using [40]
$$S_\mathrm{M}^\mathrm{topo} = k_\mathrm{B} \sum_{N_\mathrm{c}} p(N_\mathrm{c}) \ln \max\!\left(1, \left(N_\mathrm{c}^3 \pi\right)^{1/2}/\sigma\right) \qquad (4)$$
where $\sigma$ is the symmetry number of the molecule according to its united atoms. The max function only takes effect for the very small values of $N_\mathrm{c}$, which are rare. Thus there are $N_\mathrm{c}^{1/2}$ orientations per rotational axis, and every orientation is taken to have the same probability, $\sigma/(N_\mathrm{c}^3 \pi)^{1/2}$, justified by the weak correlation of these moderately polar molecules with their neighbours. For linear molecules with two axes of rotation [40], the equation is
$$S_\mathrm{M}^\mathrm{topo} = k_\mathrm{B} \sum_{N_\mathrm{c}} p(N_\mathrm{c}) \ln \max\!\left(1, N_\mathrm{c}/\sigma\right) \qquad (5)$$
Molecules with a single united atom may still have orientational entropy at the atom level if their hydrogens sufficiently break symmetry, so as to form distinct energy wells. Ammonia is included in this category, as water had been earlier [40], but methane and hydrogen sulfide are not. $N_\mathrm{c}$ is evaluated using the parameter-free relative angular distance (RAD) method [54,55] according to the centre of mass of each molecule. RAD determines $N_\mathrm{c}$ from a single configuration in good agreement with values obtained using a cut-off at the first minimum in the radial distribution function. It avoids the need for a mean-field, spherically symmetric cut-off that must either be chosen arbitrarily or evaluated from the pre-computed radial distribution function.
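The following is a minimal sketch (assuming a list of per-frame, per-molecule RAD coordination numbers; not the authors' code) of how Equations (4) and (5) turn a coordination-number distribution and a symmetry number into an orientational entropy.

```python
import numpy as np

R = 8.314  # gas constant, J/(K mol)

def orientational_entropy(coordination_numbers, sigma, linear=False):
    """coordination_numbers: iterable of RAD N_c values; sigma: molecular symmetry number."""
    nc, counts = np.unique(np.asarray(coordination_numbers, dtype=float), return_counts=True)
    p = counts / counts.sum()                                       # p(N_c)
    if linear:
        n_orient = np.maximum(1.0, nc / sigma)                      # Equation (5)
    else:
        n_orient = np.maximum(1.0, np.sqrt(np.pi * nc**3) / sigma)  # Equation (4)
    return R * np.sum(p * np.log(n_orient))

# e.g. a benzene-like symmetry number of 12 with typical liquid coordination numbers
print(orientational_entropy([6, 7, 7, 8, 8, 9], sigma=12))  # roughly 9 J/(K mol)
```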

2.5. United-Atom Topographical Entropy

The topographical entropy at the united-atom level, $S_\mathrm{UA}^\mathrm{topo}$, also called the conformational entropy, is derived from the distribution of discrete conformations for all flexible dihedrals involving united atoms. Unlike in the previous work on liquid entropy [40], in which the molecules had at most one flexible dihedral, a number of molecules here have multiple dihedrals. Given that these may be correlated, we present a new method to account for this using a conformation correlation matrix. Each molecule has $N_\mathrm{dih}$ dihedrals, taken as four consecutive, bonded united atoms. The topographical entropy of dihedrals at the atomic level and involving hydrogen is ignored, either because they have only one conformation by symmetry, such as a methyl group, or because they have negligibly more than one conformation, such as a hydroxyl, owing to limited variable hydrogen-bonding capability to neighbouring molecules. The molecules considered here have three conformations per dihedral: trans (t), gauche− (g−) and gauche+ (g+), defined with boundaries in dihedral angle at 120°, 0° and −120°, respectively. Thus each molecule has available $3N_\mathrm{dih}$ conformations. Every combination of conformations for each molecule is termed a conformer, and the total possible number of conformers is $3^{N_\mathrm{dih}}$. Overall we ensure there is no double-counting of identical conformers by treating g− and g+ as distinct and dividing by the rotational symmetry number in Equations (4) and (5). We construct the $3N_\mathrm{dih} \times 3N_\mathrm{dih}$ correlation matrix $\rho$, which has elements
$$\rho_{ij} = p_{ij}\, r_{ij} / P_{m(i)} \qquad (6)$$
where $p_{ij}$ is the probability of simultaneously having the conformation pair $i$ and $j$, normalised such that $\sum_{i=3m}^{3m+2} \sum_{j=3n}^{3n+2} p_{ij} = 1$ for the square sub-block over all conformations $i$ and $j$ of the respective dihedrals $m$ and $n$, and $r_{ij}$ is the Pearson correlation coefficient of conformations $i$ and $j$, given by
$$r_{ij} = \frac{p_{ij} - p_i p_j}{\left[(p_i - p_i^2)(p_j - p_j^2)\right]^{1/2}} \qquad (7)$$
where $p_i$ is the probability of conformation $i$. Note that $r_{ii} = 1$, $p_{ii} = p_i$, and $p_{ij} = 0$ if $i$ and $j$ belong to the same dihedral. The $P_{m(i)}$ in Equation (6) are normalisation constants, one per dihedral $m$, that are defined to ensure $\sum_{i=3m}^{3m+2} \sum_{j=1}^{3N_\mathrm{dih}} \rho_{ij} = 1$ for each dihedral $m$. Thus $\rho_{ij}$ represents the fraction of correlation that the conformation pair $ij$ makes to the total correlation that the dihedral $m$ of conformation $i$ has with all conformations of all dihedrals. Similar to the von Neumann entropy of the density matrix [56], the total conformational entropy is given by
$$S_\mathrm{UA}^\mathrm{topo} = -k_\mathrm{B} \sum_{i=1}^{3N_\mathrm{dih}} \lambda_i \ln \lambda_i \qquad (8)$$
where $\lambda_i$, the eigenvalues of $\rho$, are the probabilities of the conformer eigenvectors, and each conformer eigenvector itself comprises the probabilities of each conformation. Unlike the density matrix, whose trace equals 1 [56], the trace of $\rho$ ranges from 1, corresponding to full correlation between conformations, to a maximum of $N_\mathrm{dih}$, corresponding to fully uncorrelated conformations. For a molecule with uncorrelated conformations or with only one dihedral [40], the eigenvalues would be the diagonal elements of $\rho$ and the conformer eigenvectors would be the individual conformations. At the other extreme of full correlation, as would occur when there is only one single conformer, one eigenvalue would equal 1 with its eigenvector being that very conformer, while the remaining eigenvalues would be zero. For cases of intermediate correlation between conformations, the eigenvector conformers would have various contributions from the correlated conformations, with entropy ranging from zero to the fully disordered value for all $N_\mathrm{dih}$ dihedrals.
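As a concrete illustration of Equations (6)–(8), the sketch below (an illustrative reconstruction under the normalisations stated above, not the authors' code) builds the conformation correlation matrix from a trajectory of discrete conformation labels and returns the conformational entropy per mole.

```python
import numpy as np

R = 8.314  # gas constant, J/(K mol)

def conformational_entropy(conf):
    """conf: (n_samples, n_dih) integer array of conformations, 0 = t, 1 = g-, 2 = g+."""
    n, n_dih = conf.shape
    d = 3 * n_dih
    onehot = np.zeros((n, d))
    onehot[np.arange(n)[:, None], 3 * np.arange(n_dih) + conf] = 1.0
    p = onehot.mean(axis=0)                # p_i for each of the 3*N_dih conformations
    pij = onehot.T @ onehot / n            # p_ij; each dihedral-pair block sums to 1
    var = p - p**2
    with np.errstate(divide="ignore", invalid="ignore"):
        r = (pij - np.outer(p, p)) / np.sqrt(np.outer(var, var))   # Equation (7)
    r = np.nan_to_num(r)
    np.fill_diagonal(r, 1.0)               # r_ii = 1 by definition
    rho = pij * r                          # numerator of Equation (6); same-dihedral off-diagonals vanish
    for m in range(n_dih):                 # P_m: normalise each dihedral's three rows to sum to 1
        rows = slice(3 * m, 3 * m + 3)
        rho[rows, :] /= rho[rows, :].sum()
    lam = np.linalg.eigvals(rho).real
    lam = lam[lam > 1e-12]
    return -R * np.sum(lam * np.log(lam))  # Equation (8), per mole

# Two fully anti-correlated dihedrals behave as a single two-state conformer pair:
traj = np.array([[0, 2], [2, 0], [0, 2], [2, 0]])
print(conformational_entropy(traj))        # ~ R ln 2 = 5.8, not 2 R ln 2
```

In this toy case the full correlation reduces the entropy from 2R ln 2 for independent dihedrals to R ln 2, mirroring the behaviour described in Section 3.3 for ring molecules with fully correlated dihedrals.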

2.6. Molecular Dynamics Simulations

The entropy was calculated for a series of 56 liquids using molecular dynamics simulations. All simulations were carried out with the sander module of the AMBER 14 simulation package [57]. Each system consists of 500 identical molecules in the liquid phase in a cubic box. The force fields used were GAFF [53] with AM1-BCC charges for all molecules and OPLS-AA with 1.14*CM1A charges [52] for all molecules except acetonitrile, carbon dioxide, hydrogen sulfide and tetrafluoroethylene, for which charges were not available on the LigParGen webserver [52]. For carbon dioxide, a simulation was instead run with the TraPPE (Transferable Potentials for Phase Equilibria) force field [58]. All molecules were built in standard geometry using xleap of AMBER 14. GAFF force-field parameters were generated with antechamber [59] and all molecules were placed in a cubic box of side 6 nm using Packmol [60]. For OPLS, the GROMACS topology and coordinate files were obtained by uploading a PDB file of each molecule to the LigParGen webserver [52] with the 1.14*CM1A charges, the coordinates of the box of molecules were generated in GROMACS 5.1 [61], and the topology and coordinate files were converted into AMBER format using the AMBER ParmEd tool. Note that these OPLS charges differ from those in previous work [40] with the OPLS force field, which had charges fitted to liquid-phase properties [62]. TraPPE parameters for carbon dioxide were added directly by hand.
For equilibration, each system was minimized with 500 steps of steepest-descent minimization, thermalized in a 100 ps molecular dynamics simulation at constant volume and temperature using a Langevin thermostat with a collision frequency of 5 ps⁻¹, and brought to the correct density with 1 ns of molecular dynamics simulation at constant pressure using the Berendsen barostat with a time constant of 2 ps. For data collection, forces and coordinates were saved every 1 ps in a further 1 ns simulation under the same conditions, which earlier work had shown to be easily sufficient for converged values [40]; there, as few as ten frames were often sufficient to achieve converged integer values in units of J K⁻¹ mol⁻¹. The pressure was 1 bar and the temperature was 298 K unless the liquid was gaseous at that temperature, in which case the boiling temperature at 1 bar was used, as listed in Table 1. The exception is carbon dioxide, which does not liquefy at ambient pressure, so the pressure was set to 5.99 bar and the temperature to 220 K, which is in the liquid-phase region, close to the triple point, and matches conditions used in a 2PT study [50]. Simulations used SHAKE on all bonds involving hydrogen atoms, a non-bonded cutoff of 8 Å, periodic boundary conditions, particle-mesh Ewald summation with default parameters in AMBER, and a 2 fs timestep. Table 3 contains all the liquids simulated, for five of which the following abbreviations are used: dimethylformamide (DMFA), dimethylsulfoxide (DMSO), N-methyl acetamide (NMA), tert-butyl alcohol (TBA) and tetrafluoroethylene (TFE). Symmetry numbers in Equations (4) and (5) are listed in Table S1. Entropies were calculated with in-house C++ and Perl code, reading in the force, coordinate and topology files and writing out eigenvalues and coordination numbers.
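For orientation, a production-run input consistent with the settings listed above might look like the following sketch. It is not reproduced from the study, and the keyword set (in particular the force-output frequency ntwf and the use of the default Berendsen barostat) should be checked against the AMBER 14 manual for the installed sander version.

```
Production: NPT, 1 ns, coordinates and forces every 1 ps (assumed input, not from the paper)
 &cntrl
   imin = 0, irest = 1, ntx = 5,               ! continue from the equilibrated restart
   nstlim = 500000, dt = 0.002,                ! 1 ns with a 2 fs timestep
   ntc = 2, ntf = 2,                           ! SHAKE on bonds involving hydrogen
   cut = 8.0,                                  ! 8 Angstrom cutoff with default PME
   ntt = 3, gamma_ln = 5.0, temp0 = 298.0,     ! Langevin thermostat, 5 ps^-1 collision frequency
   ntb = 2, ntp = 1, taup = 2.0, pres0 = 1.0,  ! constant pressure, Berendsen with 2 ps time constant
   ntpr = 500, ntwx = 500, ntwf = 500,         ! write energies, coordinates and forces every 1 ps
 /
```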

3. Results

3.1. Entropy Values

Figure 1 presents the entropy of 50 of the liquids calculated by MCC for the OPLS and GAFF force fields plotted against the respective experimental values [63,64,65,66,67,68,69,70]. Table 2 gives the mean unsigned and signed deviations, slopes, intercepts, Pearson correlation coefficients $R^2$, and zero-intercept slopes of entropies by the MCC and 2PT methods with respect to experiment. Table 3 contains the MCC entropy values for all 56 liquids, together with values from experiment, the MCC entropy of carbon dioxide with the TraPPE force field, and values using the 2PT method with the OPLS and GAFF force fields for fifteen liquids [48], carbon dioxide [50] and methanol and hexane including torsional fluidicity [51]. Statistical errors are negligible for the precision given.
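The statistics in Table 2 are standard; a brief sketch of how they can be computed from paired calculated and experimental entropies is given below (illustrative only; the function and variable names are assumptions).

```python
import numpy as np

def agreement_stats(calc, expt):
    """calc, expt: paired entropies in J/(K mol). Returns Table 2-style statistics."""
    calc, expt = np.asarray(calc, float), np.asarray(expt, float)
    err = calc - expt
    slope, intercept = np.polyfit(expt, calc, 1)        # least-squares line
    r2 = np.corrcoef(expt, calc)[0, 1] ** 2             # squared Pearson correlation
    slope0 = np.sum(expt * calc) / np.sum(expt ** 2)    # slope when forced through the origin
    return {"mean unsigned error": np.mean(np.abs(err)),
            "mean signed error": np.mean(err),
            "slope": slope, "intercept": intercept,
            "R2": r2, "zero-intercept slope": slope0}
```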
The experimental entropies for most liquids were taken from the NIST Chemistry Webbook [63]. If more than one value was reported by different authors, all values were included, although for acetic acid, ethanol, ethylene glycol, formic acid, propanol and pyridine the spread is substantial, exceeding 10 J K⁻¹ mol⁻¹. Entropies were found elsewhere for ammonia [64], chloroform [65], methane [66], hydrogen peroxide [67], hydrogen sulfide [68] and carbon dioxide [69]. Values for ethylamine and triethylamine were calculated from the experimental gas-phase entropy, enthalpy of vaporization, and either heat capacity at constant pressure or partial pressure [63,70] (see Table S2 for details). For the remaining six liquids no values could be found in the literature. The experimental entropy is averaged if there is more than one value.
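For context, one standard route consistent with the quantities just listed, assuming the liquid is in equilibrium with its vapour at partial pressure $p_\mathrm{vap}$ (a hedged sketch, not reproduced from Table S2), is
$$S_\mathrm{liq}(T) \approx S^\circ_\mathrm{gas}(T) - R\ln\frac{p_\mathrm{vap}}{p^\circ} - \frac{\Delta H_\mathrm{vap}}{T}, \qquad S^\circ_\mathrm{gas}(T) = S^\circ_\mathrm{gas}(T_\mathrm{ref}) + \int_{T_\mathrm{ref}}^{T} \frac{C_p^\mathrm{gas}(T')}{T'}\,\mathrm{d}T'$$
where $p^\circ$ = 1 bar and the heat-capacity integral shifts the tabulated gas-phase entropy to the temperature of interest.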
The entropy values calculated by MCC agree well with experiment, with Table 2 showing a mean unsigned error of less than 10 J K⁻¹ mol⁻¹, GAFF being slightly better than OPLS. The small mean signed errors, the slopes being marginally less than one, and the positive y-intercept suggest that MCC slightly underestimates the dependence on molecular size, although forcing the line through zero brings about the correct unity slope. The excessive entropies seen for larger molecules in the earlier version of the theory [40] no longer occur because we no longer halve forces for hard internal degrees of freedom in the mean-field approximation.
Comparisons with experiment are affected by the accuracy of the force field. To compare MCC with 2PT, Table 2 contains the statistical quantities for the liquids studied by the 2PT method [48] listed in Table 3, comprising 14 with the GAFF force field [53] and 12 with the OPLS force field [62], together with the corresponding MCC values with GAFF and OPLS with 1.14*CM1A charges [52]. For both force fields, the mean unsigned error for 2PT is three times that of MCC, largely because the 2PT values are too small, as shown by the negative mean error, negative y-intercept, and poorer correlation. The slope is close to unity but decreases when forced through the origin. The difference between the two methods is unlikely to be due solely to the different OPLS force fields, given that the trend is present for GAFF, that the 2PT values using the earlier OPLS force field reproduce liquid-phase entropy better, and that the same trend was observed earlier when comparing with the same force field [40,71]. The variability in experiment for acetic acid and ethanol may affect this comparison, in that MCC is closer to the higher value and 2PT closer to the lower value, but this would be insufficient to affect the overall trend. The poorer MCC performance of OPLS with 1.14*CM1A charges compared to GAFF likely reflects the over-polarization of the charges to optimise their free energy of hydration [52]. This also likely explains the better performance of OPLS than GAFF for 2PT. Including the localized bond-charge corrections, an alternative provided by LigParGen, is unlikely to lead to any improvement in entropy, given their mixed performance in calculating enthalpies of vaporization and densities of liquids [52]. The more positive signed error for OPLS indicates that its entropies are overall larger than the GAFF entropies, implying that the combined intermolecular and intramolecular OPLS interactions are marginally weaker than those of GAFF. Contrary to this trend, the largest deviations between the force fields are for ammonia and DMSO, where OPLS is ∼20 J K⁻¹ mol⁻¹ lower than GAFF.

3.2. Entropy Components

To give deeper insight into the entropy values, their six components in Equation (1) are illustrated in Figure 2 for the case of GAFF (Table S3 has numerical values for both force fields). Plotted in Figure 3 are the entropy components as a function of molecular mass, and Table 4 lists data for the lines of best fit. The first observation is the dominance of the molecular translational and rotational entropy, which make up more than half the total entropy for all but the largest molecules. $S_\mathrm{M}^\mathrm{transvib}$ has a weak dependence on mass, deviating lower for systems at colder temperatures. $S_\mathrm{M}^\mathrm{rovib}$ has a stronger mass dependence and is lower for colder and linear molecules and those forming hydrogen bonds. One point to emphasise about our decomposition is that molecules that are linear in terms of their united atoms, such as ethane or acetonitrile, have negligible rotational entropy about their long axis at the molecule level. The entropy about this axis including hydrogens is instead assigned to the united-atom level.
$S_\mathrm{UA}^\mathrm{rovib}$ is slightly smaller, making up about a quarter of the total. It primarily comprises the twisting of united atoms such as methyls (∼17 J K⁻¹ mol⁻¹) and hydroxyls (∼13 J K⁻¹ mol⁻¹) as well as hydrogen bending, such as in benzene, and thus relates more specifically to the number of hydrogens. As mentioned earlier, for linear molecules with two united atoms, it also includes the entropy of rotation about the long axis because this term would otherwise be zero without hydrogens. For example, for ethene $S_\mathrm{M}^\mathrm{rovib}$ is smaller than for other molecules, and most of its $S_\mathrm{UA}^\mathrm{rovib}$ is rotational entropy about the long axis, leaving about 3 J K⁻¹ mol⁻¹ for internal motion. The remaining three terms are more variable and together make up about a quarter to a third of the total. The orientational term $S_\mathrm{M}^\mathrm{topo}$ weakly increases with mass and is smaller for molecules with higher symmetry or those that form hydrogen bonds, which tend to reduce $N_\mathrm{c}$. $S_\mathrm{UA}^\mathrm{transvib}$ mainly comprises dihedral vibration of united atoms and has a strong dependence on mass, as does the conformational term $S_\mathrm{UA}^\mathrm{topo}$, which is one of the smallest terms and only present for 13 liquids. The lines of best fit for each component indicate moderate predictability based on mass, but a thorough treatment is beyond the scope of this work. Comparing the force fields, GAFF has marginally higher molecular vibrational entropy (1.5 J K⁻¹ mol⁻¹) and higher $S_\mathrm{UA}^\mathrm{topo}$ (5.2 J K⁻¹ mol⁻¹), whereas OPLS has more $S_\mathrm{UA}^\mathrm{rovib}$ (2.2 J K⁻¹ mol⁻¹). Of the most extreme deviations, $S_\mathrm{UA}^\mathrm{topo}$ of GAFF is 14 J K⁻¹ mol⁻¹ higher than OPLS for 2-butoxyethanol and 12 J K⁻¹ mol⁻¹ higher for diethanolamine. Why this is so is revealed by inspection of the probability distributions in Table S4, which indicate that the reduced $S_\mathrm{UA}^\mathrm{topo}$ for OPLS is due to stronger internal hydrogen bonding. Going beyond the overall entropy, these trends imply that GAFF, compared to OPLS, has weaker intermolecular interactions, consistent with the charge over-polarisation of OPLS mentioned earlier [52], more evenly occupied conformations, and stronger intramolecular interactions, particularly relating to united-atom rotation.
A direct comparison of entropy components with 2PT for the 15 liquids in common [48] cannot be made because different OPLS force fields are used, but in general the 2PT molecular translational and rotational entropies are larger than the equivalent MCC terms, and the MCC terms become slightly larger upon inclusion of the orientational term. However, the three MCC united-atom terms are larger than the internal vibrational 2PT term, which in that work did not include a fluidicity term, as noted earlier [40]. A later formulation of such a term [51] applied to ethane, methanol and hexane shows that the torsional fluidicity is only a few percent of the vibrational term and is thus not responsible for the difference with MCC.

3.3. Covariance Matrices and Coordination and Dihedral Distributions

Representative plots in Figure 4 show the force and torque covariance matrices, respectively, for the liquids using the GAFF force field (see Figures S1 and S2 for all molecules). Similar to the combined force–torque matrices in earlier work [40], the force covariance matrices show maximum auto-correlation along the diagonal and strong anti-correlation for bonded atoms. Correlations between more distant atoms are only evident for more rigid molecules, consistent with their lower vibrational entropy. Torque covariance matrices have weak correlations, most ranging from negligible up to a tenth of the diagonal self-correlation, consistent with the mean-field approximation made for united-atom rotation. Only very rigid molecules such as ethene display large correlations, but their associated entropy is very small. Molecule-level matrices are not shown, being near-purely diagonal.
Representative $p(N_\mathrm{c})$ distributions of five liquids with the GAFF force field are shown in Figure 4 (see Figure S3 for all liquids). As expected for liquids, these distributions are broad and roughly Gaussian, most peaking between $N_\mathrm{c} = 5$ and 10. As Equation (4) makes clear, larger coordination brings about larger orientational entropy. The outliers with higher coordination are the six-membered rings such as cyclohexane, piperidine and 1,4-dioxane, and carbon dioxide, versus the hydrogen-bonded molecules whose hydrogen bonds bring about more directed interactions and lower $N_\mathrm{c}$, such as methanol, diethanolamine and octanol, the last of which is slightly liquid-crystalline.
The dihedral probability distributions $p_i$ are given in Table S4 for the 13 molecules with united-atom dihedrals. Of the 11 molecules with more than one dihedral, the correlation matrix brings about only a small reduction in entropy relative to the ideal value for independent dihedrals, indicating that conformations in these non-ring systems are weakly correlated. The largest reductions are −4.2 and 1.0 J K⁻¹ mol⁻¹ for OPLS and GAFF triethylamine, followed by −1.0 and 0.4 J K⁻¹ mol⁻¹ for OPLS and GAFF 2-butoxyethanol and −0.6 J K⁻¹ mol⁻¹ for both OPLS and GAFF octanol. However, for the ring molecules, such as cyclohexane, piperidine and 1,4-dioxane, which have six fully correlated dihedrals, the method correctly picks out their two possible conformers as eigenvectors with eigenvalues according to their probability, with all other eigenvalues being zero. On the short timescale here, only a few molecules in each system convert to the other conformer. Achieving equilibrium is unnecessary for cyclohexane and 1,4-dioxane because both conformers are identical and contribute no entropy. However, the equatorial and axial conformers of piperidine are distinct, with the equatorial hydrogen on the nitrogen being lower in energy by 1.7 kJ mol⁻¹ [72], which would increase the entropy by ∼5 J K⁻¹ mol⁻¹.

4. Discussion

We have extended our MCC method to calculate entropy for a much broader range of 56 liquids than the 14 liquids studied previously [40]. To emphasise the advantages of MCC, it is simple in its theoretical formulation, informative by giving an entropy decomposition over all degrees of freedom, rapidly convergent in the number of simulation frames required, scalable to large systems with its multiscale formulation, near-general and applicable to a huge range of molecular systems, and accurate to the level of the thermal energy $k_\mathrm{B}T$ for the liquids studied here.
Of the improvements incorporated in this work, the first is the recognition that the force-halving arising from the mean-field cell approximation should not be applied to bonded united atoms because of their strong correlation. This leads to lower entropies than previously [40], which is especially important for larger molecules such as toluene or cyclohexane. The good agreement obtained earlier for single flexible molecules [39] was likely due to a cancellation of errors, with the missing rotational entropy of united atoms offsetting the larger entropy due to force halving in the force covariance matrix. Nonetheless, averaged-out correlations in the force and torque covariance matrices owing to conformational fluctuations may account for MCC entropies being lower than experiment for larger molecules. A more minor modification from previous work [40] relates to the use of separate force and torque covariance matrices, rather than a combined force–torque covariance matrix, owing to the weak correlation between forces and torques, a change which improves the computational efficiency of the method. This work shows that subunit torques are weakly correlated in most cases, meaning that even the torque covariance matrix may be unnecessary.
The second principal improvement is the correlation matrix to account for the correlation of dihedral conformations by expressing the conformational distribution in terms of a basis of conformers. A key feature of the correlation-matrix method is that it scales efficiently to large systems, with the matrix size increasing as $N_\mathrm{dih}^2$. Considering each conformer separately scales exponentially as $3^{N_\mathrm{dih}}$ and would become unfeasible for $N_\mathrm{dih} > 10$. The traditional approach using correlations in continuously valued dihedral angles has an even worse exponential dependence and scales as $N_\mathrm{bin}^{N_\mathrm{dih}}$, where $N_\mathrm{bin}$ is the number of bins. This is already problematic for $N_\mathrm{dih} > 2$, but it can be somewhat relieved by nearest-neighbour methods [16,22,23,24]. It is reasonable to assume that dihedral correlations need only be considered for local energy wells rather than for the numerical value of the dihedral, given that this correlation is unlikely to change on the timescale of molecular vibration.
A third issue to consider in future work is the multiscale approximation: how the different levels of hierarchy are defined, how to avoid double-counting of entropy between them, and how to streamline the theory further so that it is essentially equivalent at every level to maximise generality. Ideally, the determination of each level would be automated and dynamic, adjusting to the level of order in the system. Care is needed to ensure that, at every level of hierarchy, the translational or rotational entropy that duplicates that at the higher level is cleanly removed. The theory for vibrational entropy is already quite general for any level of hierarchy, while the topographical terms require more work to fuse Equations (4) and (8) into the same formulation. This would involve generalising the orientational entropy to be non-ideal so that orientations have different weightings according to the orientations of the neighbouring molecules, as has already been studied for water with its strongly directional hydrogen bonds [35,37,42,44]. The united-atom orientational entropy could also be extended to other molecules such as alcohols and amines. Nonetheless, the framework is in place to scale the method to simulated systems of greater complexity.

5. Conclusions

We have presented the multiscale cell correlation method to calculate the entropy of 56 molecular liquids from molecular dynamics simulations. The entropies are in excellent agreement with experiment for the OPLS and GAFF force fields, with GAFF performing slightly better. The agreement is better than that of the 2PT method, which can also calculate the entropy of molecular liquids. The components of entropy give an insightful and intuitive understanding of the values obtained. With suitably chosen levels of hierarchy, the method is readily scalable to larger and more complex systems.

Supplementary Materials

The following are available online at https://www.mdpi.com/1099-4300/21/8/750/s1, Table S1: Symmetry Number for Each Molecule, Table S2: Experimental Data to Calculate Liquid-Phase Entropy, Table S3: MCC Entropy Components (J K⁻¹ mol⁻¹) for OPLS and GAFF, Table S4: Conformation Probabilities, Figure S1: United-atom (UA) force covariance matrices, Figure S2: UA torque covariance matrices, Figure S3: Probability distribution functions $p(N_\mathrm{c})$ of coordination number.

Author Contributions

Conceptualization, resources, supervision and project administration, R.H.H.; investigation and funding acquisition, H.S.A.; methodology and software, J.H. and R.H.H.; validation, visualization and writing—original draft preparation, H.S.A. and R.H.H.; formal analysis and writing—review and editing, H.S.A., J.H. and R.H.H.

Funding

HSA was funded by the Punjab Education Endowment Fund Pakistan and the School of Chemistry at the University of Manchester. Support is acknowledged from IT Services at the University of Manchester for providing the Computational Shared Facility for running the simulations and from CCPBiosim under EPSRC grant EP/M022609/1 for providing training workshops on software and programming.

Acknowledgments

Helpful discussions were had with Frauke Gräter on multiscale and mean-field aspects.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MCC     Multiscale Cell Correlation
2PT     2-Phase Thermodynamics
GAFF    Generalized AMBER Force Field
OPLS    Optimized Potentials for Liquid Simulations
TraPPE  Transferable Potentials for Phase Equilibria

References

  1. Peter, C.; Oostenbrink, C.; van Dorp, A.; van Gunsteren, W.F. Estimating entropies from molecular dynamics simulations. J. Chem. Phys. 2004, 120, 2652–2661. [Google Scholar] [CrossRef] [PubMed]
  2. Zhou, H.X.; Gilson, M.K. Theory of Free Energy and Entropy in Noncovalent Binding. Chem. Rev. 2009, 109, 4092–4107. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Van Speybroeck, V.; Gani, R.; Meier, R.J. The calculation of thermodynamic properties of molecules. Chem. Soc. Rev. 2010, 39, 1764–1779. [Google Scholar] [CrossRef] [PubMed]
  4. Polyansky, A.A.; Zubac, R.; Zagrovic, B. Estimation of Conformational Entropy in Protein-Ligand Interactions: A Computational Perspective. In Computational Drug Discovery and Design; Baron, R., Ed.; Springer: Berlin, Germany, 2012; Volume 819, pp. 327–353. [Google Scholar]
  5. Baron, R.; McCammon, J.A. Molecular Recognition and Ligand Association. Ann. Rev. Phys. Chem. 2013, 64, 151–175. [Google Scholar] [CrossRef] [PubMed]
  6. Suárez, D.; Diaz, N. Direct methods for computing single-molecule entropies from molecular simulations. Rev. Comput. Sci. 2015, 5, 1–26. [Google Scholar] [CrossRef]
  7. Kassem, S.; Ahmed, M.; El-Sheikh, S.; Barakat, K.H. Entropy in bimolecular simulations: A comprehensive review of atomic fluctuations-based methods. J. Mol. Graph. Model. 2015, 62, 105–117. [Google Scholar] [CrossRef] [PubMed]
  8. Butler, K.T.; Walsh, A.; Cheetham, A.K.; Kieslich, G. Organised chaos: Entropy in hybrid inorganic-organic systems and other materials. Chem. Sci. 2016, 7, 6316–6324. [Google Scholar] [CrossRef] [PubMed]
  9. Chong, S.H.; Chatterjee, P.; Ham, S. Computer Simulations of Intrinsically Disordered Proteins. Ann. Rev. Phys. Chem. 2017, 68, 117–134. [Google Scholar] [CrossRef] [PubMed]
  10. Huggins, D.J.; Biggin, P.C.; Damgen, M.A.; Essex, J.W.; Harris, S.A.; Henchman, R.H.; Khalid, S.; Kuzmanic, A.; Laughton, C.A.; Michel, J.; et al. Biomolecular simulations: From dynamics and mechanisms to computational assays of biological activity. WIREs Comput. Mol. Sci. 2019, 9, e1393. [Google Scholar] [CrossRef]
  11. Edholm, O.; Berendsen, H.J.C. Entropy estimation from simulations of non-diffusive systems. Mol. Phys. 1984, 51, 1011–1028. [Google Scholar] [CrossRef]
  12. Wallace, D.C. On the role of density-fluctuations in the entropy of a fluid. J. Chem. Phys. 1987, 87, 2282–2284. [Google Scholar] [CrossRef]
  13. Baranyai, A.; Evans, D.J. Direct entropy calculation from computer-simulation of liquids. Phys. Rev. A 1989, 40, 3817–3822. [Google Scholar] [CrossRef] [PubMed]
  14. Lazaridis, T.; Karplus, M. Orientational correlations and entropy in liquid water. J. Chem. Phys. 1996, 105, 4294–4316. [Google Scholar] [CrossRef]
  15. Killian, B.J.; Kravitz, J.Y.; Gilson, M.K. Extraction of configurational entropy from molecular simulations via an expansion approximation. J. Chem. Phys. 2007, 127, 024107. [Google Scholar] [CrossRef] [PubMed]
  16. Hnizdo, V.; Tan, J.; Killian, B.J.; Gilson, M.K. Efficient calculation of configurational entropy from molecular simulations by combining the mutual-information expansion and nearest-neighbor methods. J. Comput. Chem. 2008, 29, 1605–1614. [Google Scholar] [CrossRef] [Green Version]
  17. King, B.M.; Silver, N.W.; Tidor, B. Efficient Calculation of Molecular Configurational Entropies Using an Information Theoretic Approximation. J. Phys. Chem. B 2012, 116, 2891–2904. [Google Scholar] [CrossRef] [Green Version]
  18. Suárez, E.; Suárez, D. Multibody local approximation: Application to conformational entropy calculations on biomolecules. J. Chem. Phys. 2012, 137, 084115. [Google Scholar] [CrossRef]
  19. Goethe, M.; Gleixner, J.; Fita, I.; Rubi, J.M. Prediction of Protein Configurational Entropy (Popcoen). J. Chem. Theory Comput. 2018, 14, 1811–1819. [Google Scholar] [CrossRef]
  20. Goethe, M.; Fita, I.; Rubi, J.M. Testing the mutual information expansion of entropy with multivariate Gaussian distributions. J. Chem. Phys. 2017, 147, 224102. [Google Scholar] [CrossRef] [Green Version]
  21. Suárez, E.; Diaz, N.; Suárez, D. Entropy Calculations of Single Molecules by Combining the Rigid-Rotor and Harmonic-Oscillator Approximations with Conformational Entropy Estimations from Molecular Dynamics Simulations. J. Chem. Theory Comput. 2011, 7, 2638–2653. [Google Scholar] [CrossRef]
  22. Hnizdo, V.; Darian, E.; Fedorowicz, A.; Demchuk, E.; Li, S.; Singh, H. Nearest-neighbor nonparametric method for estimating the configurational entropy of complex molecules. J. Comput. Chem. 2007, 28, 655–668. [Google Scholar] [CrossRef] [PubMed]
  23. Hensen, U.; Lange, O.F.; Grubmüller, H. Estimating Absolute Configurational Entropies of Macromolecules: The Minimally Coupled Subspace Approach. PLoS ONE 2010, 5, e9179. [Google Scholar] [CrossRef] [PubMed]
  24. Huggins, D.J. Estimating Translational and Orientational Entropies Using the k-Nearest Neighbors Algorithm. J. Chem. Theory Comput. 2014, 10, 3617–3625. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Karplus, M.; Kushick, J.N. Methods for Estimating the Configuration Entropy of Macromolecules. Macromolecules 1981, 14, 325–332. [Google Scholar]
  26. Schlitter, J. Estimation of absolute and relative entropies of macromolecules using the covariance-matrix. Chem. Phys. Lett. 1993, 215, 617–621. [Google Scholar] [CrossRef]
  27. Andricioaei, I.; Karplus, M. On the calculation of entropy from covariance matrices of the atomic fluctuations. J. Chem. Phys. 2001, 115, 6289–6292. [Google Scholar] [CrossRef]
  28. Chang, C.E.; Chen, W.; Gilson, M.K. Evaluating the accuracy of the quasiharmonic approximation. J. Chem. Theory. Comput. 2005, 1, 1017–1028. [Google Scholar] [CrossRef]
  29. Reinhard, F.; Grubmüller, H. Estimation of absolute solvent and solvation shell entropies via permutation reduction. J. Chem. Phys. 2007, 126, 014102. [Google Scholar] [CrossRef]
  30. Dinola, A.; Berendsen, H.J.C.; Edholm, O. Free-energy determination of polypeptide conformations generated by molecular-dynamics. Macromolecules 1984, 17, 2044–2050. [Google Scholar] [CrossRef]
  31. Hikiri, S.; Yoshidome, T.; Ikeguchi, M. Computational Methods for Configurational Entropy Using Internal and Cartesian Coordinates. J. Chem. Theory Comput. 2016, 12, 5990–6000. [Google Scholar] [CrossRef] [Green Version]
  32. Gyimesi, G.; Zavodszky, P.; Szilagyi, A. Calculation of Configurational Entropy Differences from Conformational Ensembles Using Gaussian Mixtures. J. Chem. Theory Comput. 2017, 13, 29–41. [Google Scholar] [CrossRef]
  33. Lin, S.T.; Blanco, M.; Goddard, W.A. The two-phase model for calculating thermodynamic properties of liquids from molecular dynamics: Validation for the phase diagram of Lennard-Jones fluids. J. Chem. Phys. 2003, 119, 11792–11805. [Google Scholar] [CrossRef] [Green Version]
  34. Henchman, R.H. Partition function for a simple liquid using cell theory parametrized by computer simulation. J. Chem. Phys. 2003, 119, 400–406. [Google Scholar] [CrossRef]
  35. Henchman, R.H. Free energy of liquid water from a computer simulation via cell theory. J. Chem. Phys. 2007, 126, 064504. [Google Scholar] [CrossRef]
  36. Klefas-Stennett, M.; Henchman, R.H. Classical and quantum Gibbs free energies and phase behavior of water using simulation and cell theory. J. Phys. Chem. B 2008, 112, 3769–3776. [Google Scholar] [CrossRef] [PubMed]
  37. Henchman, R.H.; Irudayam, S.J. Topological hydrogen-bond definition to characterize the structure and dynamics of liquid water. J. Phys. Chem. B 2010, 114, 16792–16810. [Google Scholar] [CrossRef] [PubMed]
  38. Green, J.A.; Irudayam, S.J.; Henchman, R.H. Molecular interpretation of Trouton’s and Hildebrand’s rules for the entropy of vaporization of a liquid. J. Chem. Thermodyn. 2011, 43, 868–872. [Google Scholar] [CrossRef]
  39. Hensen, U.; Gräter, F.; Henchman, R.H. Macromolecular Entropy Can Be Accurately Computed from Force. J. Chem. Theory Comput. 2014, 10, 4777–4781. [Google Scholar] [CrossRef]
  40. Higham, J.; Chou, S.Y.; Gräter, F.; Henchman, R.H. Entropy of Flexible Liquids from Hierarchical Force-Torque Covariance and Coordination. Mol. Phys. 2018, 116, 1965–1976. [Google Scholar] [CrossRef]
  41. Irudayam, S.J.; Henchman, R.H. Entropic Cost of Protein-Ligand Binding and Its Dependence on the Entropy in Solution. J. Phys. Chem. B 2009, 113, 5871–5884. [Google Scholar] [CrossRef]
  42. Irudayam, S.J.; Henchman, R.H. Solvation theory to provide a molecular interpretation of the hydrophobic entropy loss of noble gas hydration. J. Phys. Condens. Matter 2010, 22, 284108. [Google Scholar] [CrossRef] [PubMed]
  43. Irudayam, S.J.; Plumb, R.D.; Henchman, R.H. Entropic trends in aqueous solutions of common functional groups. Faraday Discuss. 2010, 145, 467–485. [Google Scholar] [CrossRef]
  44. Irudayam, S.J.; Henchman, R.H. Prediction and interpretation of the hydration entropies of monovalent cations and anions. Mol. Phys. 2011, 109, 37–48. [Google Scholar] [CrossRef] [Green Version]
  45. Gerogiokas, G.; Calabro, G.; Henchman, R.H.; Southey, M.W.Y.; Law, R.J.; Michel, J. Prediction of Small Molecule Hydration Thermodynamics with Grid Cell Theory. J. Chem. Theory Comput. 2014, 10, 35–48. [Google Scholar] [CrossRef]
  46. Michel, J.; Henchman, R.H.; Gerogiokas, G.; Southey, M.W.Y.; Mazanetz, M.P.; Law, R.J. Evaluation of Host-Guest Binding Thermodynamics of Model Cavities with Grid Cell Theory. J. Chem. Theory Comput. 2014, 10, 4055–4068. [Google Scholar] [CrossRef]
  47. Gerogiokas, G.; Southey, M.W.Y.; Mazanetz, M.P.; Heifetz, A.; Bodkin, M.; Law, R.J.; Henchman, R.H.; Michel, J. Assessment of Hydration Thermodynamics at Protein Interfaces with Grid Cell Theory. J. Phys. Chem. B 2016, 120, 10442–10452. [Google Scholar] [CrossRef]
  48. Pascal, T.A.; Lin, S.T.; Goddard, W.A. Thermodynamics of liquids: Standard molar entropies and heat capacities of common solvents from 2PT molecular dynamics. Phys. Chem. Chem. Phys. 2011, 13, 169–181. [Google Scholar] [CrossRef]
  49. Lin, S.T.; Maiti, P.K.; Goddard, W.A. Two-Phase Thermodynamic Model for Efficient and Accurate Absolute Entropy of Water from Molecular Dynamics Simulations. J. Phys. Chem. B 2010, 114, 8191–8198. [Google Scholar] [CrossRef]
  50. Huang, S.N.; Pascal, T.A.; Goddard, W.A.; Maiti, P.K.; Lin, S.T. Absolute Entropy and Energy of Carbon Dioxide Using the Two-Phase Thermodynamic Model. J. Chem. Theory Comput. 2011, 7, 1893–1901. [Google Scholar] [CrossRef]
  51. Lai, P.K.; Lin, S.T. Rapid determination of entropy for flexible molecules in condensed phase from the two-phase thermodynamic model. RSC Adv. 2014, 4, 9522–9533. [Google Scholar] [CrossRef]
  52. Dodda, L.S.; Cabeza de Vaca, I.; Tirado-Rives, J.; Jorgensen, W.L. LigParGen web server: An automatic OPLS-AA parameter generator for organic ligands. Nucleic Acids Res. 2017, 45, W331–W336. [Google Scholar] [CrossRef] [PubMed]
  53. Wang, J.M.; Wolf, R.M.; Caldwell, J.W.; Kollman, P.A.; Case, D.A. Development and testing of a general amber force field. J. Comput. Chem. 2004, 25, 1157–1174. [Google Scholar] [CrossRef] [PubMed]
  54. Higham, J.; Henchman, R.H. Locally adaptive method to define coordination shell. J. Chem. Phys. 2016, 145, 084108. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  55. Higham, J.; Henchman, R.H. Overcoming the limitations of cutoffs for defining atomic coordination in multicomponent systems. J. Comput. Chem. 2018, 39, 699–703. [Google Scholar] [CrossRef] [PubMed]
  56. Von Neumann, J. Mathematical Foundations of Quantum Mechanics; Princeton University Press: Princeton, NJ, USA, 1955. [Google Scholar]
  57. Case, D.A.; Berryman, J.T.; Betz, R.M.; Cerutti, D.S.; Cheatham, T.E., III; Darden, T.A.; Duke, R.E.; Giese, T.J.; Gohlke, H.; Goetz, A.W.; et al. AMBER 2015; University of California: San Francisco, CA, USA, 2015. [Google Scholar]
  58. Potoff, J.J.; Siepmann, J.I. Vapor-liquid equilibria of mixtures containing alkanes, carbon dioxide, and nitrogen. AICHE J. 2001, 47, 1676–1682. [Google Scholar] [CrossRef]
  59. Wang, J.M.; Wang, W.; Kollman, P.A.; Case, D.A. Automatic atom type and bond type perception in molecular mechanical calculations. J. Mol. Graph. Model. 2006, 25, 247–260. [Google Scholar] [CrossRef] [PubMed]
  60. Martinez, L.; Andrade, R.; Birgin, E.G.; Martinez, J.M. PACKMOL: A Package for Building Initial Configurations for Molecular Dynamics Simulations. J. Comput. Chem. 2009, 30, 2157–2164. [Google Scholar] [CrossRef]
  61. Abraham, M.J.; Murtola, T.; Schultz, R.; Páll, S.; Smith, J.C.; Hess, B.; Lindahl, E. GROMACS: High performance molecular simulations through multi-level parallelism from laptops to supercomputers. SoftwareX 2015, 1–2, 19–25. [Google Scholar] [CrossRef]
  62. Jorgensen, W.L.; Tirado-Rives, J. Potential energy functions for atomic-level simulations of water and organic and biomolecular systems. Proc. Natl. Acad. Sci. USA 2005, 102, 6665–6670. [Google Scholar] [CrossRef] [Green Version]
  63. National Institute of Standards and Technology. Standard Reference Database Number 69. In NIST Chemistry Webbook; National Institute of Standards and Technology: Gaithersburg, MD, USA, 2018. [Google Scholar]
  64. Overstreet, R.; Giauque, W.F. Ammonia. The Heat Capacity and Vapor Pressure of Solid and Liquid. Heat of Vaporization. The Entropy Values from Thermal and Spectroscopic Data. J. Am. Chem. Soc. 1937, 59, 254–259. [Google Scholar] [CrossRef]
  65. Lide, D.R. (Ed.) CRC Handbook of Chemistry and Physics, 99th ed.; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  66. Younglove, B.; Ely, J. Thermophysical Properties of Fluids. II. Methane, Ethane, Propane, Isobutane, and Normal Butane. J. Phys. Chem. Ref. Data 1987, 16, 577–798. [Google Scholar] [CrossRef]
  67. Giguère, P.A.; Liu, I.D.; Dugdale, J.S.; Morrison, J.A. Hydrogen peroxide—The low temperature heat capacity of the solid and the 3rd law entropy. Can. J. Chem. 1954, 32, 117–128. [Google Scholar] [CrossRef]
  68. Giauque, W.F.; Blue, R.W. Hydrogen Sulfide. The Heat Capacity and Vapor Pressure of Solid and Liquid. The Heat of Vaporization. A Comparison of Thermodynamic and Spectroscopic Values of the Entropy. J. Am. Chem. Soc. 1936, 58, 831–837. [Google Scholar] [CrossRef]
  69. Perry, R.H.; Green, D.W.; Maloney, J.O. Perry’s Chemical Engineers’ Handbook, 8th ed.; McGraw-Hill: New York, NY, USA, 2007. [Google Scholar]
  70. Stull, D.R.; Westrum, E.F., Jr.; Sinke, G.C. The Chemical Thermodynamics of Organic Compounds; Wiley: New York, NY, USA, 1969. [Google Scholar]
  71. Caleman, C.; van Maanen, P.J.; Hong, M.Y.; Hub, J.S.; Costa, L.T.; van der Spoel, D. Force Field Benchmark of Organic Liquids: Density, Enthalpy of Vaporization, Heat Capacities, Surface Tension, Isothermal Compressibility, Volumetric Expansion Coefficient, and Dielectric Constant. J. Chem. Theory Comput. 2012, 8, 61–74. [Google Scholar] [CrossRef] [PubMed]
  72. Blackburne, I.D.; Katritzky, A.R.; Takeuchi, Y. Conformation of piperidine and of derivatives with additional ring heteroatoms. Acc. Chem. Res. 1975, 8, 300–306. [Google Scholar] [CrossRef]
Figure 1. Multiscale cell correlation (MCC) entropy values versus experiment for OPLS (blue), GAFF (red), and TraPPE (green), together with the line of perfect agreement (dotted).
Figure 2. MCC entropy components for GAFF (bottom to top): molecular translational (dark blue), molecular rotational (blue), molecular topographical (cyan), united-atom translational (dark red), united-atom rotational (red), and united-atom topographical (orange).
Figure 3. MCC entropy components for GAFF versus molecular mass for all liquids. The colouring is as in Figure 2.
Figure 4. Panels for five representative liquids (GAFF) illustrating the UA force (left) and torque (right) covariance matrices and coordination-number probability distributions p(N_c) (lower). For the matrices, white and black represent correlations of 1 and −1, respectively, with grey in between. The matrix origin is at the lower left.
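The coordination-number distributions p(N_c) in Figure 4 contribute to the topographical terms through their probabilities. As an illustration only, the sketch below evaluates the generic −R Σ p ln p form for such a distribution; the function name and the example probabilities are assumptions made for demonstration, not values taken from this work or necessarily the exact MCC expression.

```python
import numpy as np

R = 8.314462618  # gas constant, J K^-1 mol^-1

def topographical_entropy(p):
    """-R * sum_i p_i ln p_i for a normalised probability distribution,
    e.g. a coordination-number distribution p(N_c) as in Figure 4."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()      # renormalise in case of counting noise
    p = p[p > 0.0]       # convention: 0 ln 0 = 0
    return -R * np.sum(p * np.log(p))

# A hypothetical p(N_c) spread over five coordination states
print(topographical_entropy([0.05, 0.25, 0.40, 0.25, 0.05]))  # ~11 J K^-1 mol^-1
```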
Table 1. Boiling Temperature of Liquids [63] that are Gaseous at Ambient Conditions.

Liquid | T/K | Liquid | T/K | Liquid | T/K | Liquid | T/K
ammonia | 240 | ethane | 185 | hydrogen sulfide | 213 | methylamine | 267
butane | 272 | ethene | 170 | methane | 112 | propane | 231
carbon dioxide | 220 a | ethylamine | 291 | methanethiol | 279 | TFE | 197
diazene | 275 | | | | | |

a Pressure is 5.99 bar.
Table 2. Statistical Data for MCC and 2-Phase Thermodynamics (2PT) versus Experiment.

Data Set (Number of Liquids) | |S − S_expt|/J K⁻¹ mol⁻¹ | S − S_expt/J K⁻¹ mol⁻¹ | Slope | Y-Intercept/J K⁻¹ mol⁻¹ | R² | Zero-Intercept Slope
MCC OPLS a (46) | 9.8 | 0.6 | 0.94 | 11.7 | 0.95 | 1.00
MCC GAFF (50) | 8.7 | −0.3 | 0.93 | 13.0 | 0.96 | 0.99
2PT OPLS b (12) | 15.5 | −15.6 | 1.05 | −25.3 | 0.84 | 0.92
2PT GAFF (14) | 28.0 | −24.4 | 0.97 | −19.5 | 0.55 | 0.87
MCC OPLS a (12) | 4.9 | 2.3 | 0.87 | 26.7 | 0.89 | 1.01
MCC GAFF (14) | 7.6 | 4.0 | 0.93 | 16.5 | 0.93 | 1.02

a OPLS with 1.14*CM1A charges [52]; b OPLS with charges optimised to liquid-phase properties [62].
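Statistics of the kind reported in Table 2 can be regenerated from the per-liquid values in Table 3. The following is a minimal sketch, not the analysis used in this work; the function name fit_stats, the NumPy dependency, and the use of the coefficient of determination for R² are assumptions. The zero-intercept slope is the best-fit proportionality constant when the line is forced through the origin.

```python
import numpy as np

def fit_stats(s_calc, s_expt):
    """Summary statistics comparing calculated and experimental entropies
    (J K^-1 mol^-1), analogous to the columns of Table 2."""
    s_calc = np.asarray(s_calc, dtype=float)
    s_expt = np.asarray(s_expt, dtype=float)
    diff = s_calc - s_expt
    mae = np.mean(np.abs(diff))                        # mean unsigned error
    mse = np.mean(diff)                                # mean signed error
    slope, intercept = np.polyfit(s_expt, s_calc, 1)   # ordinary least-squares fit
    pred = slope * s_expt + intercept
    r2 = 1.0 - np.sum((s_calc - pred) ** 2) / np.sum((s_calc - s_calc.mean()) ** 2)
    zero_slope = np.sum(s_expt * s_calc) / np.sum(s_expt ** 2)  # fit through the origin
    return {"MAE": mae, "MSE": mse, "slope": slope,
            "intercept": intercept, "R2": r2, "zero_slope": zero_slope}

# Example with three GAFF entries from Table 3 (acetone, benzene, cyclohexane);
# the mean is used where two experimental values are listed.
print(fit_stats([206, 182, 212], [200, 174, 205]))
```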
Table 3. Entropy by Experiment, MCC and 2PT (J K⁻¹ mol⁻¹).

Liquid | Experiment a | MCC OPLS | MCC GAFF | 2PT OPLS [48] | 2PT GAFF [48]
acetic acid | 158, 194 | 177 | 180 | 147 | 128
acetone | 200 | 202 | 206 | 198 | 187
acetonitrile | 150 | | 143 | | 145
ammonia | 87 b | 71 | 92 | |
aniline | 191, 192 | 205 | 205 | |
benzene | 173, 175 | 183 | 182 | 172 | 161
benzyl alcohol | 217 | 216 | 208 | |
benzaldehyde | 221 | 204 | 204 | |
butane | 227, 230, 231 | 214 | 212 | |
butanol | 226, 228 | 244 | 235 | |
2-butoxyethanol | | 293 | 301 | |
carbon dioxide | 118 c | 111 d | 106 | 112 d |
chloroform | 202 e | 203 | 210 | 193 | 226
cyclohexane | 204, 206 | 220 | 212 | |
diazene | 121 | 125 | 116 | |
dichloromethane | 175 | 190 | 191 | |
diethanolamine | | 248 | 256 | |
diethyl ether | 253, 254 | 237 | 236 | |
DMFA | | 214 | 222 | |
DMSO | 189 | 183 | 202 | 164 | 159
1,4-dioxane | 197 | 206 | 199 | 179 | 159
ethane | 127 | 125 | 127 | |
ethanol | 160, 161, 177 | 177 | 175 | 141 | 127
ethene | 118 | 114 | 120 | |
ethyl acetate | 259 | 254 | 252 | |
ethylamine | 189 f | 181 | 185 | |
ethylene glycol | 167, 180 | 172 | 175 | 141 | 121
formamide | | 151 | 153 | |
formic acid | 128, 132, 143 | 156 | 145 | |
furan | 177 | 181 | 186 | 167 | 157
hexane | 290, 295, 296 | 273 | 272 | 251 g |
hexanol | 287 | 288 | 281 | |
hydrazine | 122 | 120 | 116 | |
hydrogen peroxide | 110 h | 126 | 125 | |
hydrogen sulfide | 106 i | | 101 | |
methane | 79 j | 73 | 78 | |
methanethiol | 163 | 177 | 172 | |
methanol | 127, 130, 136 | 139 | 139 | 117 g, 122 | 109
methylamine | 150 | 128 | 133 | |
NMA | | 205 | 206 | 181 | 168
octanol | | 335 | 331 | |
pentane | 259, 263 | 251 | 250 | |
pentanol | 255, 259 | 264 | 257 | |
piperidine | 210 | 234 | 222 | |
propane | 171 | 176 | 176 | |
propanol | 193, 214 | 213 | 206 | |
pyridine | 178, 179, 210 | 191 | 189 | |
styrene | 238, 241 | 223 | 223 | |
TBA | 190, 198 | 218 | 217 | |
tetrahydrofuran | 204 | 188 | 192 | 196 | 159
TFE | 184 | | 207 | 195 | 185
toluene | 219, 221 | 224 | 223 | 204 | 190
triethylamine | 309 f | 292 | 295 | |
m-xylene | 252, 254 | 248 | 248 | |
o-xylene | 246, 248 | 245 | 245 | |
p-xylene | 244, 247, 253 | 243 | 243 | |
a Reference [63], experimental errors < 1 J K⁻¹ mol⁻¹; b Reference [64]; c References [50,69]; d TraPPE force field [58]; e Reference [65]; f Derived in Table S2 using References [63,70]; g Reference [51]; h Reference [67]; i Reference [68]; j Reference [66].
Table 4. Lines of Best Fit for the Entropy Components versus Molecular Mass.

Component | Slope/J K⁻¹ g⁻¹ | Y-Intercept/J K⁻¹ mol⁻¹ | R² | Component | Slope/J K⁻¹ g⁻¹ | Y-Intercept/J K⁻¹ mol⁻¹ | R²
S_M^transvib | 0.21 | 50 | 0.54 | S_UA^transvib | 0.42 | 14 | 0.63
S_M^rovib | 0.35 | 28 | 0.70 | S_UA^rovib | 0.43 | 6 | 0.34
S_M^topo | 0.09 | 16 | 0.13 | S_UA^topo | 0.39 | 16 | 0.87
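As a worked illustration of reading Table 4 (taking hexane's molar mass of 86.18 g mol⁻¹ from standard data rather than from the table), the united-atom topographical fit gives

S_UA^topo ≈ 0.39 J K⁻¹ g⁻¹ × 86.18 g mol⁻¹ + 16 J K⁻¹ mol⁻¹ ≈ 50 J K⁻¹ mol⁻¹,

noting that components with small R², such as S_M^topo (R² = 0.13), correlate only weakly with mass, so such estimates are correspondingly rough.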
