# Following one’s passion

Simmi Puri

I distinctly remember the farewell speech of my teacher Barman. Her parting words were, “The path of your life will fork many a time, and at each fork you will need to choose. Take the path less trodden, as few dare to do so.”

I always took the most challenging path till I realised that I was wrong. While success comes by challenging yourself, happiness comes by reducing entropy.

This year I experienced disorder, disintegration, and randomness – as though something very powerful had hit my head, leaving me completely disoriented. It was a fleeting moment, but the impact was phenomenal.

We have all experienced something of this sort at some point in our lives. Our life is inextricably woven into our natural environment, the social and professional institutions we are associated with, and even the myriad technological devices we use. A small influence can cause a phenomenal displacement, which means that cause and effect are not proportional: a small cause can have significant consequences, while a major effort might yield little. In mathematics, we call such relationships nonlinear.

Nonlinear mathematical problems rarely admit explicit solutions. We face such problems in real life and learn to deal with them. I will use the realm of physics and mathematics to explain how we can handle them, or even prevent their occurrence.

Chaos theory is a fascinating idea. It describes how the flap of a butterfly’s wings in Brazil could set off a tornado in Texas. The butterfly effect is a widely used term for nonlinear phenomena that are effectively impossible to predict or control: turbulence, weather, the stock market, our thought processes and so on. A more precise way to put it is that small changes in the initial conditions lead to drastically different outcomes. Edward Lorenz was the pioneer of chaos theory. In 1961, Lorenz was working on weather prediction using a computer that carried its parameters to six decimal places. He took a printout that showed the numbers to only three decimal places and re-entered them, so instead of 5.123456 he inputted 5.123. The weather patterns the computer produced in the new simulation were drastically different from those of the original run.
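The sensitivity Lorenz stumbled upon can be demonstrated in a few lines of code. The sketch below uses the logistic map rather than Lorenz’s actual weather equations (a simplifying assumption), but it exhibits the same effect: two starting values that agree to three decimal places soon follow completely different trajectories.

```python
# Sensitivity to initial conditions, illustrated with the logistic map
# x_{n+1} = r * x_n * (1 - x_n) at r = 4, a standard chaotic regime.
# This is a stand-in for Lorenz's weather model, not his actual system.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points that differ only from the fourth decimal place on,
# mirroring Lorenz's truncated printout.
a = logistic_trajectory(0.123456)
b = logistic_trajectory(0.123)

for n in (0, 10, 30, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  (gap {abs(a[n] - b[n]):.6f})")
```

Running this, the gap between the two trajectories starts out at less than a thousandth and quickly grows until the two runs bear no resemblance to each other, just as Lorenz’s two simulations did.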

Chaos is a science of surprises, of the nonlinear and the unpredictable. If the butterfly had not flapped its wings at just the right point in space and time, the tornado would not have happened. Chaos theory also suggests that if one can understand all of the variables affecting a system, an underlying pattern will eventually emerge, making outcomes easier to predict. It is all about identifying patterns, and data science today has evolved to identify them. Since it is impossible to measure the effects of all the butterflies in the world, accurate long-range weather prediction will always remain out of reach. How, then, do we measure this randomness or disorder to make it manageable?

Entropy is a measure of the disorder of a system. The term was coined by Rudolf Clausius in the 19th century. Disorder here refers to the number of different microscopic states a system can be in at a particular point in time.

For example, place a small marble in a large box and shake it. The marble could be anywhere in the box. Now place the same marble in a small box whose sides just touch its surface. Even after shaking the box, one knows exactly where the marble is. The marble in the small box has low entropy. By reducing the number of possible positions the marble can occupy, we reduce its entropy.

For those who are mathematically inclined, disorder refers to the number of different states a system can be in, given that the composition of the system is fixed: entropy = (Boltzmann’s constant k) × logarithm of the number of possible states N, i.e. S = k log N.
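The two boxes above can be plugged straight into this formula. The sketch below is illustrative: the cell counts are assumptions, treating each box as a grid of marble-sized positions rather than a physically measured system.

```python
import math

# Boltzmann entropy S = k * log(N), where N is the number of states
# the system can be in. The state counts below are illustrative:
# treat each box as a grid of marble-sized cells.
K_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def entropy(n_states):
    """Boltzmann entropy for a system with n_states equally likely states."""
    return K_B * math.log(n_states)

small_box = entropy(1)      # marble just fits: a single possible position
large_box = entropy(1000)   # marble could sit in any of 1000 cells

print(f"small box: {small_box:.3e} J/K")   # log(1) = 0, so entropy is zero
print(f"large box: {large_box:.3e} J/K")
```

The small box has exactly zero entropy, since with one possible state there is no uncertainty at all; the large box has more, and the only way to lower it is to shrink N.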

Entropy is the universal tendency toward disorder, disorganisation, disintegration, and chaos. Knowing the entropy of a system can tell us many things about what can and cannot happen. The concept of entropy originated in the 19th-century study of heat, temperature, work, and energy – thermodynamics.

The second law of thermodynamics states that the universe evolves in such a way that its total entropy always stays the same or increases. We can see examples of entropy everywhere. Buildings left alone become ruins, soil gets eroded, cars in a junkyard rust, great civilisations eventually fall apart, and the human body decays and decomposes. Our sun will eventually burn out. All organised systems tend to disorganise. In common parlance, we could say, “Everything around us is falling apart, all the time.”

Entropy is inevitable. The golden path of life is to manage entropy, and since it increases with the number of possible states a system or an individual can be in at a given point in time, the idea is to simplify and reduce options so as to curb randomness and disorder. Untoward occurrences and events in the physical world can be averted if we know how to reduce the entropy associated with them.

The infusion of technology has changed the size of our box, which here depicts our universe, and that universe is continuously expanding. Today, the ubiquity of the Internet and hyper-connectivity have expanded our physical space. We have a virtual life that is a natural extension of our physical life. This means that at a given point in time and space we now occupy more states: virtual personas and avatars. The fact that we can be in several states at once has magnified our entropy manifold.

So there is chaos caused by randomness and disorder, and it is continuously increasing. Entropy management is empowerment. To manage it, one may shrink the size of the box – the universe that surrounds us – by defining what one wants, or what one loves. Since one’s work will fill a large part of one’s life, the only way to reduce the disorder (which translates into unhappiness) is to do what one loves.

If one hasn’t found one’s passion, one must keep looking for it. One will know when one finds it. A path will then carve itself out, eliminating the choices and the randomness of the maze – the path of least entropy. So even if that path is the one less trodden, it will be the one of least entropy, as long as passion is followed.