ENTROPY: The Greatest Blunder in the History of Science
B**T
This Book is About More than Just Entropy
Having some knowledge of psychophysics and its application to the social sciences, I find certain similarities in concepts and mathematical constructs between thermodynamics, statistical mechanics, information theory, and psychophysics as applied to social psychology. In this book, Arieh Ben-Naim provides examples and minimal mathematics, clearly stated and easy to follow, in the first two chapters, while at the same time promoting the need for a more general definition within which entropy falls. The remaining two chapters address the various uses, misuses, and outright bizarre uses of entropy and equilibrium by various experts. Chapters 3 and 4 could have been presented a little more delicately, but then I have not had to professionally live among scientific experts and watch as they created the conditions for a scientific Tower of Babel, only to wander off in confusion, each speaking their own language so that they could no longer work together.

Being able to cross academic disciplines is critical in today's world. If we do not, we stagnate. Learning from others' techniques, concepts, and applications avoids wasting time learning something from scratch within your own discipline. But that also means that if I want to learn about concepts in physics or information theory, I expect practitioners to agree on their use of shared terms and concepts. For a species that loves to mark its social group boundaries, this in itself is a challenge; but if experts within a single discipline use varying terms, concepts, and definitions, then there is little to no hope that another academic discipline can make use of the confused mess. This, to me, is one of the most important points exposed in "Entropy: The Greatest Blunder in the History of Science."

Dr. Ben-Naim may not provide the final answer in his book, but he should get credit for giving the discussion some needed energy. One concept introduced in this book that particularly caught my attention is that of using maximum SMI to find the equilibrium distribution based on known constraints (see the sketch at the end of this review). Given the examples in Chapters 1 & 2, this is a very intuitive concept, and one that I can make use of in my work. Other experts would describe maximum SMI instead as maximum entropy, indicating that it corresponds to the least amount of knowledge about the system under known constraints (i.e., it determines an a priori distribution). I can live with the latter in Bayesian statistics, but the equilibrium-distribution concept provides a more robust, descriptive, and inclusive tool in the toolbox than the simple a priori distribution approach.

In all, I view this book as a challenge to experts in the communities of thermodynamics, statistical mechanics, and information theory to work on this issue and come up with a single common language. Dr. Ben-Naim is aiding this effort by introducing what are in many cases extremely valid concepts for discussion, argument, and resolution. For that I think he should be either applauded or argued with, but not denounced as some in this comment section have done.

Note regarding Kindle: I use Kindle for PC and had absolutely no problem with this book other than minimal difficulty highlighting.
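To make the maximum-SMI idea concrete, here is a minimal sketch of my own, not anything from the book: maximizing the SMI of a six-sided die's face distribution subject to a known mean. The target mean of 4.5 and the use of scipy's constrained solver are my illustrative assumptions.

```python
# Minimal sketch (my own illustration, not from the book): find the
# distribution that maximizes the Shannon Measure of Information (SMI)
# subject to known constraints -- here, a fixed mean for a six-sided die.
import numpy as np
from scipy.optimize import minimize

outcomes = np.arange(1, 7)   # die faces 1..6
target_mean = 4.5            # the known constraint (assumed value)

def neg_smi(p):
    """Negative SMI (in nats), so that minimizing it maximizes SMI.
    The maximizer is the same regardless of the base of the logarithm."""
    p = np.clip(p, 1e-12, 1.0)   # avoid log(0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},            # normalization
    {"type": "eq", "fun": lambda p: p @ outcomes - target_mean}, # fixed mean
]
p0 = np.full(6, 1.0 / 6.0)   # start from the uniform distribution
res = minimize(neg_smi, p0, bounds=[(0.0, 1.0)] * 6, constraints=constraints)
print(np.round(res.x, 4))    # probabilities rise exponentially with face value
```

The resulting distribution is exponential in the face value, the same form the maximum-entropy school would call the least-biased prior; on the book's reading it is simply the equilibrium distribution under the given constraint.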
E**X
Repetitive but very well done
Reading this reminded me of my 5th grade teacher, who forced me to write the times tables (1 through 12) 57 times upon finding 57 forbidden papers lodged in my math book. It was a tough slog, but I learned those tables real goooood. Mr. Ben-Naim drives home the proper scientific definitions by a similar methodology.

However, he seems to take umbrage at the fact that "entropy" has several definitions (as do most words in the English language). The reason is that, by convention, words for similar concepts are simply redeployed as a semantic convenience, and we are to understand their meaning from the context.

Entropy is a state function by strict scientific orthodoxy (Def. #1). But the change from an equilibrium state A to an equilibrium state B is also named "entropy" by convention, i.e., entropy as an active 'progression' as well as a 'state' of equilibrium. So, when we say that entropy always increases, we understand what is meant by Def. #2: increasing disorder. And Def. #3: diminution of a potential gradient (gravitational, electromagnetic, or nuclear). And Def. #4: loss of information during transmission (as through a telephone line). Et cetera.

I do agree that a system MUST BE DEFINED to calculate its technical entropy, and that the universe as a whole is not, as yet, properly defined. So, saying "We're all gonna' die" from entropy is scientifically unfounded, possibly silly. Nor does it define "time's arrow."

E.B. (Author: "Ex Nihilo - The Logic of Existence")
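For readers who want Def. #1 spelled out, the standard textbook (Clausius) form is sketched below; this is my addition, not a quotation from the review or the book.

```latex
% Def. #1 in standard textbook (Clausius) form -- my addition:
% the entropy change between equilibrium states A and B is
\[
  \Delta S \;=\; S(B) - S(A) \;=\; \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T},
\]
% taken along any reversible path. Because the integral depends only on
% the endpoints A and B, S is a state function; the "progression" sense
% of entropy (Def. #2 and beyond) is a different usage layered on top.
```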
M**E
Tells what entropy is NOT, which is more enlightening than much hyperbole of what it supposedly IS
As the author of the novel "Entropy," under the pen name Dana Hayward, I quite naturally did due diligence in researching the various definitions of entropy in the field of physics before writing my book. It was during this incursion into a field completely outside my scope of expertise that I struck up a friendship with Arieh Ben-Naim, the author of "Entropy: The Greatest Blunder in the History of Science." I was greatly influenced by his many books on the subject and was led to understand, through his eyes, an elusive concept that few outside the realm of science even contemplate. In all honesty, I learned more of what entropy is NOT from Dr. Ben-Naim, yet I did not necessarily learn what it IS from any other expert that I consulted.

In this context, I have found the negative reviews of Ben-Naim's book disconcerting. His writings pointed out to me misuses of the term by those who should know better. Through his many books on the subject, of which I have three, I was given a glimpse of the limited definition of the notion of entropy through his examples of what it is not.

I am a psychologist by training, and I take no offense when laymen misuse psychological terminology, but I do when a fellow colleague does. Case in point: in Catalonia, where I lived and practiced for over a decade, the word "psychosis" is used by the general public to describe a generalized obsession or frenzy, such as the psychosis to own a pair of Nikes. If a psychologist were to misuse the term in this manner, I would undoubtedly object.

So why should Professor Ben-Naim be accused of having "an ax to grind" when he has taken on the mission of setting straight a great many loose applications of a very specific term in science? If these misuses were limited to novelists like myself, where would be the harm? But when they come from eminent scientists who spin a yarn that confuses and misleads the unsuspecting nonscientist, well, I for one am grateful for the map that Ben-Naim has provided to guide me through the minefield of popularized science.
Z**K
Self-aggrandizing but thought-provoking
The technical word "entropy" has unfortunately leaked from science into the vernacular and is being used by all and sundry in whatever sense they see fit. The author is right in pointing out that a lot of nonsensical written drivel results, with some weird and mysterious meanings ascribed to what should be a precisely defined concept. The verbiage comes most frequently from liberal arts and pseudo-philosophical sources, but unfortunately also from popular-science books, sometimes even written by well-known scientists who, in a heroic attempt to dumb down their subject to a level that can be described without formulas, borrow from this hazy and ill-defined vocabulary. Even in science, though, there is a lot of confusion between entropy in thermodynamics, statistical physics, and information theory.

Enter Prof. Arieh Ben-Naim, who, in the style of pronouncements from the Sermon on the Mount, introduces a new, idiosyncratic entropy definition to add to the already existing ones by Clausius and Boltzmann, pronounces it superior to all, and modestly calls it the "ABN," or "Arieh Ben-Naim," definition. The good professor should be reminded that it is grateful posterity that assigns the names of creators to their formulas. The author also supplies us with a bunch of dogmas, like "entropy is not a function of time and does not change," that seem to go contrary to accepted science but are actually not incorrect when interpreted according to the author's many private definitions. In the example above, the author insists that entropy is only defined as a state function at equilibrium. When the system is away from equilibrium, the entropy simply does not exist. So it is admitted that the entropy in a new equilibrium state of an isolated system is higher than in the old equilibrium state before, say, the removal of restraints, but technically it never changes during the time when it is defined. (What changes in the meantime is what the author calls the SMI, or "Shannon Measure of Information," which for anybody else would eerily resemble the Boltzmann H function.)

Having defined everything in a way nobody else defines it, Prof. Ben-Naim then disses everybody from Feynman to Hawking to Penrose for not adhering to this pettifogging pedantry. Since obviously nobody is good enough, the only books Prof. Ben-Naim recommends as references throughout are his own.

Having said all that, I actually read the book from beginning to end twice, including the appendices from the web portal, and the book got me thinking. Somehow I got satisfaction from watching this particular train wreck, so I give it four stars.
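For the curious, the resemblance noted above can be made explicit; the forms below are the standard textbook ones, added here by way of illustration rather than taken from the book.

```latex
% Shannon's measure over a discrete distribution p_1..p_n:
\[
  \mathrm{SMI}(p) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i ,
\]
% and Boltzmann's H functional over the velocity distribution f(v,t):
\[
  H(t) \;=\; \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\,d^{3}v ,
\]
% with dH/dt <= 0 (the H-theorem). Up to sign and the base of the
% logarithm, -H is the continuous analogue of the SMI, which is why
% a time-varying SMI looks so familiar.
```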
A**O
Detailed
Very interesting and instructive
A**R
A scientific review of misinterpretations of Entropy as expressed by renowned scientists.
The book by Prof. Arieh Ben-Naim on the enigmatic topic of entropy is written in clear and accessible language. Avid readers of popular-science literature as well as serious readers of textbooks on thermodynamics will find the book interesting and an eye-opener. The bestselling popular-science literature by renowned authors, written in flowery language, casts a spell on readers, making them lose their ability to analyze the content and seek the truth. This is evident from the unfair reviews available. Such books on occasion indulge in misinterpretation and distortion of topics like entropy and the Second Law. In this regard, a careful comparison of the two bestsellers "A Brief History of Time" by S. Hawking and "A Briefer History of Time" by S. Hawking and L. Mlodinow will be interesting and an eye-opener. The comparison reveals that the latter is devoid of many topics mentioned in the former. The latter gives no explanation for this omission and most probably leaves it to readers.

The book by Prof. Ben-Naim challenges and reviews the veracity of misinterpretations and popular statements related to entropy and the Second Law. In short, the book discusses the blunders made by renowned authors.

Fortunately, reading and appreciating the book does not call for a rigorous understanding of mathematics, physics, and chemistry, but it definitely calls for an open mind. Some background in thermodynamics and statistical mechanics is helpful for understanding this book. The calculations mentioned on occasion are simple and easy to comprehend. The more serious and inquisitive reader can refer to the appendices and other works of the author.

The book begins with a chapter devoted to three definitions of entropy: Clausius's, Boltzmann's, and the definition proposed by the author, referred to as ABN's. It is shown that these definitions are equivalent, and the reviews presented in the book are based on them. However, the detail of how these definitions are equivalent is not the subject of this book; those details are available in the author's other books. In the course of reading, it becomes clear that there is nothing like maximum entropy or minimum entropy or negative entropy as is usually seen in the literature. What the author calls entropy is what Clausius called entropy and what Boltzmann and Gibbs called entropy, and all of these are equivalent to ABN entropy. The author further logically counters the belief, expressed by many scientists, that the Shannon Measure of Information (SMI) is entropy.

It is worth noting that the book is founded on three definitions of entropy, starting with Clausius's, then Boltzmann's, and then ABN's. In the entire book, the author does not use, discover, or invent any type of entropy different from what is given in textbooks of thermodynamics. The author did not change the definition of entropy! As explained in the introduction of the book, the main criticism is directed at those who use the concept of entropy (as defined in all textbooks) outside the "framework of its applicability." Based on the three definitions of entropy, the book aptly criticizes several interpretations and misapplications of entropy, as well as of the Second Law, presented by renowned scientists and authors. Some authors define entropy as a measure of disorder and then claim that the disorder of the universe always increases. This is not a definition of entropy but a misinterpretation of it.
The author rightly interprets entropy as a special case of SMI. The book shatters most of the misconceptions built around entropy and the Second Law and will prove very useful for inquisitive readers with open minds.
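The "special case" relationship, as I understand it from the author's other books (my paraphrase, not a quotation from this one), can be sketched as follows.

```latex
% Entropy as a special case of SMI -- my paraphrase of the relationship
% described in the author's other books, not a quotation from this one:
\[
  S \;=\; k_B \ln 2 \;\times\; \mathrm{SMI}^{*},
\]
% where SMI* is the maximum of the SMI over the distribution of the
% particles' locations and momenta (attained at equilibrium), and
% k_B ln 2 converts from bits to thermodynamic units (J/K). A general,
% time-varying SMI is thus broader than entropy; entropy is the special
% value the SMI takes at equilibrium.
```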