Taming the Simple Entropies for Complicated Molecules

Lalit Koundaal

Chemists at the University of Bonn, Germany, have developed a computational tool for the analysis of conformational entropies of flexible molecules. The results were published in the journal Chemical Science and highlighted as the “Pick of the Week” article.

The University of Bonn is one of the eleven German Universities of Excellence, with six Clusters of Excellence, Nobel Prize winners, and Fields Medalists.

It was the German physicist Rudolf Clausius who introduced the term “entropy” in 1865. He later worked at the University of Bonn and served as its rector. The year 2022 marks the 200th anniversary of his birth, and the University of Bonn has planned scientific events and celebrations to mark the occasion.

The Computational Tool

Coming back to the computational tool!

In their latest work, Prof. Dr. Stefan Grimme and his co-workers from the Mulliken Center for Theoretical Chemistry at the University of Bonn developed a new computational tool for the calculation of conformational entropies, providing accurate thermodynamic descriptions of flexible molecules.

Stefan Grimme’s research group works on current topics in quantum chemistry with a focus on computational efficiency and large molecules. His co-worker Philipp Pracht is the main developer of the program CREST, which is employed for the conformational entropy calculations.

The suggested method enables the thermodynamic investigation of complicated chemical systems through a combination of modern quantum chemical and classical models. Thanks to a successful set of simplifications, the entropy can be calculated even on standard desktop computers with minimal user intervention.

While mathematical formulations for calculating the conformational entropy have been known for quite some time, one main problem is finding and evaluating the huge number of possible structures, which already reaches billions for medium-sized molecules.
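Those formulations are typically based on Boltzmann-weighted conformer populations. The following is a minimal sketch of that idea, not the authors’ actual implementation: given the relative energies of a set of conformers, it computes their populations and the resulting Gibbs–Shannon conformational entropy.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

def conformational_entropy(energies_kjmol, T=298.15):
    """Gibbs-Shannon conformational entropy, S = -R * sum(p_i * ln p_i),
    from conformer energies (kJ/mol, relative to the lowest conformer)."""
    RT = R * T / 1000.0  # thermal energy in kJ/mol
    weights = [math.exp(-e / RT) for e in energies_kjmol]
    Z = sum(weights)                  # partition function
    populations = [w / Z for w in weights]
    return -R * sum(p * math.log(p) for p in populations if p > 0.0)

# Three equally stable conformers give S = R * ln(3):
s = conformational_entropy([0.0, 0.0, 0.0])
print(round(s, 3))  # 9.134 J/(mol*K)
```

The hard part, which the new software addresses, is not this final summation but generating the billions of candidate structures whose energies feed into it.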

Hence, a core component of the newly introduced and freely available software is an efficient algorithm for this task that works with minimal user input, even on standard desktop computers. To achieve the required efficiency, semiempirical quantum chemical methods, also developed in Grimme’s group, were applied together with standard quantum mechanical calculations.

In the article, the authors show that the procedure can treat even large and extremely flexible systems with unprecedented accuracy for the molecular entropy. It is the authors’ hope that the new computational protocol will help to obtain accurate thermodynamic data more routinely and that it will find widespread application in computational chemistry.

Entropy – a brief refresher

Entropy, commonly associated with a state of disorder or uncertainty, is one of the most fundamental thermodynamic properties of matter. The concept has also taken hold in statistical mechanics, as pioneered by the famous physicists Josiah Willard Gibbs and Ludwig Boltzmann, and in information theory. Today, entropy remains an active area of research across many scientific fields, including computational chemistry.

Well, all this sounds confusing and doesn’t help in understanding what entropy is.

Why was it invented in the first place?

It was during the Industrial Revolution, when scientists wanted engines to be more efficient, that the concept of entropy was born. It helped engineers to build better engines.

The statistical-mechanical definition tells us that a system’s entropy is equal to the Boltzmann constant multiplied by the natural logarithm of the number of possible microstates.

If a system has many possible microstates, the particles in that system can be arranged in many ways, and the entropy of the system is larger. Such systems are called “disordered” because of the large number of ways in which they could be arranged.

Systems with smaller values of entropy, on the other hand, are more “ordered”. The more possible microstates, the larger the entropy.

This brings us to the common description of entropy as a “measure of disorder”.

Another fine definition of entropy is that it is a measure of how spread out energy is. We have also heard that entropy always increases, which means that energy will always move from a clumped-up state to a spread-out one. This also means that, eventually, all the energy in the universe will be evenly spread out.

This is called the heat death of the universe, which is scientists’ best guess at how the universe will end.

Well, don’t worry – this is not going to happen for another ten thousand trillion … trillion … years!




