From owner-chemistry@ccl.net Wed Mar 1 09:39:00 2017
From: "Jan Jensen compchemhighlights++gmail.com"
To: CCL
Subject: CCL: Computational Chemistry Highlight: February issue
Message-Id: <-52661-170301032058-2866-hhg/LpP1uYezM/GSZ91fcg=-=server.ccl.net>
Date: Wed, 1 Mar 2017 09:20:52 +0100

Sent to CCL by: Jan Jensen [compchemhighlights]|[gmail.com]

The February issue of Computational Chemistry Highlights is out.

CCH is an overlay journal that identifies the most important papers in computational and theoretical chemistry published in the last 1-2 years. CCH is not affiliated with any publisher: it is a free resource run by scientists, for scientists. You can read more about it here.

The table of contents for this issue features contributions from CCH editors Steven Bachrach and Jan Jensen:

Towards full Quantum Mechanics based Protein-Ligand Binding Affinities
Preparation of an ion with the highest calculated proton affinity: ortho-diethynylbenzene dianion
Conformer-specific hydrogen atom tunnelling in trifluoromethylhydroxycarbene

Interested in more? There are many ways to subscribe to CCH updates. Also, for your daily computational chemistry fix, subscribe to Computational Chemistry Daily.

From owner-chemistry@ccl.net Wed Mar 1 12:26:00 2017
From: "Michael Morgan michaelmorgan937:-:gmail.com"
To: CCL
Subject: CCL: Entropy of a Bimolecular System
Message-Id: <-52662-170301121304-19374-Ymz8G3Q15w1PQGkqpXPbtA-*-server.ccl.net>
Date: Wed, 1 Mar 2017 11:12:50 -0600

Sent to CCL by: "Michael Morgan" [michaelmorgan937|*|gmail.com]

Dear Mr. Guo,

Can you point me to citations for "the connection between information and thermodynamic entropy had been experimentally confirmed"? I am interested in the topic.

Thank you very much.

Best,
Michael Morgan

From: owner-chemistry+michaelmorgan937==gmail.com%ccl.net On Behalf Of Hao-Bo Guo guohaobo::gmail.com
Sent: Tuesday, February 7, 2017 10:06 PM
To: Morgan, Michael
Subject: CCL: Entropy of a Bimolecular System

Hi Mr. Ernest,

I do not think I really understand entropy either.

The entropy you mention is, I think, the thermodynamic entropy defined in statistical mechanics, which is the sum of electronic, rotational, translational, and vibrational components. Except for the vibrational term, which is derived from the frequency calculation, these terms are obtained directly from the ensemble to which the system belongs. Entropy is additive, and biomolecules differ only in their total numbers of atoms and electrons.

Nevertheless, the above approach does not really address why entropy is so significant. Entropy is equivalent to uncertainty: it is a statistical quantity and is better estimated through the probabilities of the states of the system of interest. This is exactly what Claude Shannon defined with his equation, which ultimately underlies the information technology we enjoy today. The Shannon entropy can be converted to thermodynamic entropy by multiplying by kB ln(2), where kB is Boltzmann's constant.

This relationship is often traced back to Maxwell's demon, proposed some 150 years ago by James Clerk Maxwell. The connection between information and thermodynamic entropy has been experimentally confirmed: erasure of 1 bit of information leads to dissipation of heat of kBT ln(2), where T is the temperature; this is the heat cost of information transmission and computation.

Thanks,
Hao-Bo Guo

On Feb 7, 2017 2:37 PM, "Ernest Chamot echamota/chamotlabs.com" wrote:

Hi All,

I seem to have argued myself into a state of confusion: I guess I just don't really understand the entropy of a bimolecular system.

I can calculate the enthalpy of a molecule with any number of methods, and so long as I also do an IR or frequency calculation, I can also get the entropy, and ultimately the free energy of the molecule. So if I am considering the equilibrium of a dissociation reaction, I can get the heat of reaction by modeling all three species and subtracting the enthalpy of the reactant from the sum of the enthalpies of the products. But how do I calculate the free energy of reaction?

I can't just add up the individual free energies, can I? Isn't the entropy of the pair of product molecules different from just the sum of the two individual entropies? Since there are two separate molecules in the same frame of reference, there should be an additional 6 degrees of freedom for the second molecule, even at infinite separation. Or do these all have a correspondence with a vibrational mode in the original reactant molecule? Doesn't there need to be an additional term or factor: ln(2), or angular momentum, or something?

(I'm interested in the overall reaction, not with the two product molecules still bound together in some intermediate complex. Otherwise I could just model that.)

Thanks for any help.

EC

Ernest Chamot
Chamot Labs, Inc.
http://www.chamotlabs.com
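The free-energy bookkeeping discussed in this thread can be sketched in a few lines, assuming the usual ideal-gas statistical mechanics reported by a frequency calculation; in that picture, each fragment's own frequency job already carries its full translational and rotational entropy, so the extra 6 degrees of freedom of the second product are accounted for when the per-species free energies are summed. All numerical values below are hypothetical placeholders, not results from any real calculation.

```python
# Sketch: Gibbs free energy of a dissociation reaction A -> B + C,
# built from per-species enthalpies and entropies.  The H and S
# values are hypothetical placeholders standing in for the output
# of frequency calculations on each species.
T = 298.15  # temperature, K

# Hypothetical H in kJ/mol, S in J/(mol K).  Each fragment's own
# frequency calculation includes its full translational and
# rotational entropy, so the "extra" 6 degrees of freedom of the
# second product show up automatically in S_B and S_C.
species = {
    "A": {"H": -150.0, "S": 310.0},
    "B": {"H": -60.0,  "S": 250.0},
    "C": {"H": -70.0,  "S": 200.0},
}

def gibbs(h_kj_mol, s_j_mol_k):
    """G = H - T*S, returned in kJ/mol."""
    return h_kj_mol - T * s_j_mol_k / 1000.0

dH = species["B"]["H"] + species["C"]["H"] - species["A"]["H"]
dG = (gibbs(species["B"]["H"], species["B"]["S"])
      + gibbs(species["C"]["H"], species["C"]["S"])
      - gibbs(species["A"]["H"], species["A"]["S"]))

print(f"dH = {dH:+.1f} kJ/mol, dG = {dG:+.1f} kJ/mol")
```

With these placeholder numbers the reaction is endothermic (dH = +20.0 kJ/mol) yet exergonic (dG is about -21.7 kJ/mol), because forming two independent particles brings a large translational entropy gain.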

From owner-chemistry@ccl.net Wed Mar 1 13:39:00 2017
From: "Hao-Bo Guo guohaobo^^^gmail.com"
To: CCL
Subject: CCL: Entropy of a Bimolecular System
Message-Id: <-52663-170301133745-27721-c7YHq2x+tZmx3E9r2fj8rA===server.ccl.net>
Date: Wed, 1 Mar 2017 13:37:38 -0500

Sent to CCL by: Hao-Bo Guo [guohaobo]^[gmail.com]

Hello Mr. Morgan,

Thank you for this question regarding the relationship between information entropy (Shannon entropy) and thermodynamic entropy. It can be traced back to Maxwell's demon, proposed by James Clerk Maxwell some 150 years ago, and the discussions that followed. For example, Leon Brillouin coined the term negentropy (negative entropy) (J. Appl. Phys. 1951, 22, 334-337) and showed that Shannon entropy (information entropy) can be converted to thermodynamic entropy using a scaling factor of Boltzmann's constant kB, so that one bit of information is equivalent to an energy of kBT ln 2. This relationship leads to the estimate that the information register of the whole Universe is ca. 10^90 bits (Lloyd, S. Nature 2000, 406, 1047-1054). More importantly, Landauer (IBM J. Res. Dev. 1961, 5, 183-191) proposed that information erasure dissipates heat into the environment on the order of kBT ln 2 per bit. Landauer's principle has been confirmed experimentally; see, e.g., Berut, A. et al., Nature 2012, 483, 187-189.

The importance of Landauer's principle lies in the fact that it not only links information theory to thermodynamics, but also sets a lower limit on the heat dissipated by information processing and computation.

A discussion can be found in this paper: Lutz, E. & Ciliberto, S., Information: from Maxwell's demon to Landauer's eraser. Phys. Today 2015, 68.

Hao-Bo

On Wed, Mar 1, 2017 at 12:12 PM, Michael Morgan michaelmorgan937:-:gmail.com wrote:
> Dear Mr. Guo,
>
> Can you point me to citations for "the connection between information and
> thermodynamic entropy had been experimentally confirmed"? I am interested
> in the topic.
>
> Thank you very much.
>
> Best,
> Michael Morgan
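The Landauer bound discussed above is easy to put in numbers. A minimal sketch, assuming room temperature (T = 300 K) and the exact SI value of the Boltzmann constant:

```python
# Sketch: Landauer bound k_B * T * ln(2) per erased bit, and the
# factor k_B * ln(2) that converts Shannon entropy (in bits) to
# thermodynamic entropy (in J/K).
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, SI 2019)
T = 300.0           # assumed room temperature, K

# Minimum heat dissipated when one bit of information is erased
landauer_heat = k_B * T * math.log(2)

def thermo_entropy(n_bits):
    """Thermodynamic entropy (J/K) equivalent to n bits of Shannon entropy."""
    return n_bits * k_B * math.log(2)

print(f"{landauer_heat:.3e} J per erased bit")  # about 2.87e-21 J
```

At this scale, erasing 10^21 bits would dissipate only a few joules, which is why the Landauer limit sits far below the dissipation of real computing hardware.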
From owner-chemistry@ccl.net Wed Mar 1 14:31:00 2017
From: "Ahmed Saeed ahmed_said5899-.-yahoo.com"
To: CCL
Subject: CCL:G: Perturbation theory energy analysis
Message-Id: <-52664-170301142822-10667-rv6YwY9VQ5clSfDUlo2xAg() server.ccl.net>
Date: Wed, 1 Mar 2017 14:28:21 -0500

Sent to CCL by: "Ahmed Saeed" [ahmed_said5899###yahoo.com]

Dear colleagues,

I hope you are doing well. I am studying the interaction between two organic molecules, and I want to study the effect of the non-bonding interactions on the stability of the formed adduct. My advisor told me to perform a perturbation theory energy analysis for this investigation. What are the steps of this analysis? And can I get it directly from the Gaussian 09 output file?

I will be very grateful if anyone can help. Thank you.

Ahmed

From owner-chemistry@ccl.net Wed Mar 1 20:22:01 2017
From: "ZhiPeng Li 979170845_-_qq.com"
To: CCL
Subject: CCL: reply: Perturbation theory energy analysis
Message-Id: <-52665-170301194131-7998-joAwiw4mXw3XMt/+penfQA|server.ccl.net>
Date: Wed, 1 Mar 2017 19:41:30 -0500

Sent to CCL by: "ZhiPeng Li" [979170845%x%qq.com]

Ahmed,

It is clear that you want to study the weak interactions in your systems. Although post-HF methods such as MP2 are frequently used, DFT with dispersion correction (DFT-D3) is usually recommended, for example B3LYP-D3(BJ) and M06-2X-D3(zero), since its computational cost is lower. After the computations, energy decomposition is an efficient way to analyze the weak interactions; the Morokuma method in the GAMESS-US program is commonly recommended for this. Additionally, the RDG, AIM, and CDA analysis methods are available in Multiwfn (http://multiwfn.codeplex.com/); the details can be found in its manual.

Wishing you all the best,

Li ZhiPeng
2017-03-02
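Energy decomposition schemes like the Morokuma method mentioned in the reply all start from the supermolecular interaction energy, E_int = E(AB) - E(A) - E(B). A minimal sketch of that subtraction, using hypothetical total energies in hartree (not output from any real calculation):

```python
# Sketch: supermolecular interaction energy of an adduct AB from
# three single-point energies.  All energy values are hypothetical
# placeholders for dispersion-corrected DFT results.
HARTREE_TO_KCAL = 627.509474  # kcal/mol per hartree

e_complex = -385.123456   # E(AB), hypothetical
e_mono_a  = -230.061111   # E(A),  hypothetical
e_mono_b  = -155.052222   # E(B),  hypothetical

e_int_au = e_complex - e_mono_a - e_mono_b
e_int_kcal = e_int_au * HARTREE_TO_KCAL

print(f"E_int = {e_int_kcal:.2f} kcal/mol")  # negative = attractive
```

In practice the monomer energies are usually computed in the dimer basis (counterpoise correction) to reduce basis set superposition error; the decomposition of E_int into physically meaningful components is then what schemes like the Morokuma analysis provide.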