Talk:Entropy/Archive 6


Purpose of this page

IMPORTANT - If you wish to discuss or debate the validity of the concept of entropy or the second law of thermodynamics, in their numerous verbal and mathematical forms of presentation, please do so at talk.origins, True.Origins Archive or Wikireason. This "Discussion" page is only for discussion on how to improve the Wikipedia article. Any attempts at trolling, using this page as a soapbox, or making personal attacks may be deleted at any time.

Intro sentence

I've reverted Kenosis' suggestion Entropy describes the amount of energy in a given place at a given instant in time to the earlier version. I'd judge the sentence above to be misleading, bordering on plain wrong. Which reference work defines entropy in this way? --Pjacobi 08:58, 24 October 2006 (UTC)

No Problem. Thank you to Pjacobi, Dave souza and Bduke. I'm sure the language will get sorted out in some reasonable way and find a consensus eventually. ... Kenosis 11:57, 24 October 2006 (UTC)
Taking into account Kenosis' concerns about starting with "while" and appearing to say what entropy relates to rather than what it is, I've added a line summarising the use of entropy which is already expanded further into the intro:
Entropy provides a measure of the amount of energy in a system that is unavailable to do thermodynamic work. While the concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy, the concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. In simple terms, entropy change is related to a change to a more "disordered" state at a microstate level (an early visualisation of the motional energy of molecules), and to the idea of dispersal of energy or matter.
I've also qualified "disorder" as this seems to be the main source of misunderstandings. ..dave souza, talk 10:36, 24 October 2006 (UTC)

The qualification of "disorder" might clear up the misunderstandings that we are having, but the term "microstates" just adds a bunch of misunderstandings for newcomers to entropy. I suggest we need as little jargon as possible in the intro paragraph. --Bduke 10:59, 24 October 2006 (UTC)

Changing related to a change to a more "disordered" state perhaps to
  • related to a change to a state of higher probability
  • related to a loss of information about the state
  • Or just dropping it there?
Having to use scare quotes is always a bad sign.
Pjacobi 11:08, 24 October 2006 (UTC)
In the interim I've changed "microstate" to "microscopic" – "probability" and "information" are difficult to follow without the understanding that they are driven by energetic motion, dropping "disorder" from the first paragraph could be the best answer, but the prevalence of this idea means it has to be explained somewhere, ..dave souza, talk 11:19, 24 October 2006 (UTC)
I think the language in the introduction is headed in a more useful direction. The only issue I have at the moment is with the following:
  • "...the concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy."

Specifically, I think the words "and whether they occur spontaneously" and "[s]pontaneous changes occur with an increase in entropy" could be said more accurately than this. By my admittedly limited understanding, it seems to me that entropy might be more accurately described as an instantaneous quantification of an ongoing dynamic process rather than a spontaneous change per se. ... Kenosis 13:04, 24 October 2006 (UTC)

Again, I have to protest that this proposed wording would be unintelligible to newcomers. It isn't even clear to me. Spontaneous change might not be completely accurate, but one version of the 2nd Law talks about heat flowing from hot to cold and not the reverse, and chemists use entropy to understand whether a reaction goes from left to right or the reverse. These are related to spontaneous change. I cannot think of a better way to introduce why we are interested in entropy in the first place and to motivate people about understanding entropy. If anyone can, it would be most welcome, but this suggestion does not do so. I do think we should discuss changes here and see whether we can get consensus, rather than editing the article at this point.--Bduke 21:42, 24 October 2006 (UTC)

I would think this is not as big a problem as initially surmised here. Where heat entropy involves a heat source, there is a continuous flow of that energy through the surrounding medium or media (the surrounding physical structure or space). Entropy (S) is defined as a measurement of that energy in a given place or in a defined space, at a given instant in time. A change in this measurement is called Delta-S. Where a heat-generating source (one that produces heat in excess of the rest of the defined space) is not involved, entropy within a defined space involves a movement of existing energy and/or particles towards a steady state, known as equilibrium. Surely this wording can be improved further such that it rings true to those familiar with the subject. No? ... Kenosis 05:08, 25 October 2006 (UTC) Where particle dispersal is involved, I figure the chemists here can explain the idea somehow, such that a previously uninformed, reasonably intelligent, English-speaking reader may be able to get a rough idea of the concept before moving on to the formulas. ... Kenosis 05:33, 25 October 2006 (UTC)
Do you have a source that discusses this? How can entropy (Joule/Kelvin) be a measurement of energy (Joule)? This is still unclear to me. Is it just that it is an unfamiliar idea to chemists? --Bduke 06:18, 25 October 2006 (UTC)
I immediately see you're probably right, Bduke. Yes, I alluded to only one component of the state function S in my attempt above. You've just mentioned the denominator, and thanks for that. Please work with me on this a bit more, as I've already made clear my understanding is limited and that this is not my field. Where I am coming from on this is that I believe the wording can still be better yet, more explanatory to the unfamiliar reader, though if I had a clear answer as to how, I'd already have put it forward and perhaps already received some degree of acceptance for such proposed alternate language. When I read the words "whether spontaneous change occurs", my tendency as a reader is to visualize something changing suddenly. And I also read it as saying there's a question of some kind. At my present level of understanding to date, I can readily get the idea that this clause probably refers to some kind of assessment of whether energy is, shall we say, "passing through", "spreading" or otherwise dispersing at a given instant in time, or whether it is at equilibrium. I respect that if there's no better way of saying it than the way you've chosen, then we'll end up living with that current language, I would think, and I'll indeed help back it up as a participant. But most importantly, Bduke, thanks so very much for helping to bring this article forward under knowledgeable leadership. ... Kenosis 17:26, 25 October 2006 (UTC)
The point about entropy being Joule/Kelvin is important, and perhaps could be emphasised in the first paragraph. The distinction between dispersal of energy and dispersion of matter has been questioned here, and both points could be addressed by concluding the first paragraph with "the dispersion of energy, or of energy and matter, at a given temperature". This would then relate it to the second paragraph about the quantitative approach: would this be the best phrasing? ...dave souza, talk 16:53, 25 October 2006 (UTC)
I think this is definitely going in the right direction. The only objection I have is the phrase "In simple terms, entropy change is...". This is not strictly true, and I want to say "In somewhat overly simple terms, entropy change is ...", but although it's true it seems too negative. Is there a way of saying that the "disorder" and "spreading" ideas are good hooks to get into the subject, but should not be absolutely relied upon as one's knowledge progresses? PAR 19:00, 25 October 2006 (UTC)
How about the words "Thermodynamic entropy provides a comparative measure of the amount of internal energy in a substance." (I picked up some variation on this from FrankLambert, I think.) Might be a bit clearer to a beginner who hasn't read about entropy before? ... Kenosis 19:41, 25 October 2006 (UTC)
Clearer but wrong. --Pjacobi 20:10, 25 October 2006 (UTC)
Hah! Never mind that one then. Admittedly it doesn't cover the normal route. Energy divided by temperature literally is a comparative measure of the amount of internal energy in the process of spreading or dispersal, is it not? And graphite with 5.7 J/(K·mol) has not had more energy dispersed to it from 0 K than diamond with 2.5 J/(K·mol)? ... Kenosis 20:18, 25 October 2006 (UTC)
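To make the q/T arithmetic in this exchange concrete, here is a minimal numeric sketch (my own illustration, not drawn from any of the sources discussed here) of why entropy carries units of J/K and why heat flowing from hot to cold raises the total:

```python
# Minimal sketch of dS = q/T bookkeeping for heat flowing between two
# reservoirs; entropy (J/K) is heat (J) weighted by the temperature (K)
# at which it is transferred, not a quantity of energy itself.

def entropy_change(q_joules, temp_kelvin):
    """Entropy change (J/K) when heat q enters a reservoir held at temp_kelvin."""
    return q_joules / temp_kelvin

q = 1000.0                      # joules moved from hot to cold
T_hot, T_cold = 500.0, 300.0    # reservoir temperatures (K), chosen arbitrarily

dS_hot = entropy_change(-q, T_hot)    # hot side loses heat:  -2.00 J/K
dS_cold = entropy_change(+q, T_cold)  # cold side gains heat: +3.33 J/K

# Total is positive, as the second law requires for spontaneous heat flow.
print(f"hot: {dS_hot:+.2f} J/K, cold: {dS_cold:+.2f} J/K, total: {dS_hot + dS_cold:+.2f} J/K")
```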
As I've stated before, disorder or chaos are not applicable. Those two concepts need to be tossed when discussing thermodynamic entropy. To repeat, water at 273 K is no more "ordered" than water at 373 K; conversely, water at 373 K is not chaotic or disordered in relation to water at 273 K -- both are "ordered" in a way appropriate for their respective states (the same is true for each "temperature point" (infinitesimal) on the way from 273 to 373). That humans see boiling water as chaotic, and ice as ordered, speaks far more to the human need for imposing the human concept of order on things than it does to any reality of the natural phenomenon. •Jim62sch• 21:02, 25 October 2006 (UTC)

OK, there is a lot here. You guys work while I sleep in Australia. I'll try to address several points, but let me first stress that while I have taught entropy to chemists from 1st year General Chemistry to 2nd year Physical Chemistry, I am not an expert in thermodynamics. We need an experienced expert who has researched thermodynamics and taught it, preferably to people other than physics students, who can learn it easily using mathematical approaches. Is spontaneous change a question, as Kenosis asks? I think in a sense it is. Does heat flow from a cold body to a hot body, or only from a hot body to a cold one? If so, why? These are questions about the direction of spontaneous change which many people ask. The chemist wants to know whether a chemical equation, as written, proceeds from left to right or right to left. I think it is a good idea to start from the questions people ask. To PAR, I respond that my original wording of an introductory paragraph finished with something like, "These are very simple approaches but to get more insight we need to go deeper", i.e. we need to read on. Something like that might satisfy. Maybe Jim62sch is right in a technical sense, but it is certainly true that textbooks talk about entropy in terms of chaos and disorder. To "toss" them in this article would be original research and of no help to anybody wanting to know about entropy who is reading this article and text. Maybe Frank Lambert's approach will spread, but have a look at Peter Atkins, "Physical Chemistry", 8th edition, where it has spread somewhat but the terms disorder and, I think, chaos are still used. Peter is a very successful textbook writer and populariser of science. He is worth listening to. I would also add that if you look at a molecular-level simulation of water at 373 K and compare it with one at 273 K, it certainly looks more chaotic. I think Joule/Kelvin should be mentioned in the second paragraph after S = q/T has been mentioned. I guess units put people off almost as much as maths does! Finally, a couple of thoughts. One, has anyone looked at entries in other encyclopedias? I only have one old one to hand. That talks about the idea of useful work and also about disorder. Two, I think this article will be a real challenge to Citizendium. I do not think what they are doing will work and I am not intending to move over there, but I do intend to challenge their experts to do a better job than we have done. I suspect that their experts will disagree too! Let us see if we can do better. --Bduke 23:26, 25 October 2006 (UTC)

I'm a physics grad student, and I think Bduke's suggestion of a sentence asking whether a process occurs spontaneously perfectly serves the purpose of an understandable, meaningful, and correct introduction. (On the other hand, talking about "the amount of energy in a system that is unavailable to do thermodynamic work" as it stands now, is less clear, in my opinion.) Yevgeny Kats 23:58, 25 October 2006 (UTC)
I agree about the point in brackets. That has worried me for a while. Could it be moved to the end of the intro paragraph and say ".. to do work", with "work" linked to "thermodynamic work"? It would read better that way as it is simpler. --Bduke 01:21, 26 October 2006 (UTC)

To Bduke - yes, now I remember that sentence and I thought it was a good one. Here's an introductory paragraph that sort of restores it. PAR 02:52, 26 October 2006 (UTC)

Entropy provides a measure of the amount of energy in a system that is unavailable to do thermodynamic work. While the concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy, the concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. Entropy change may be thought of as a change to a more disordered state at a microscopic level. It may also be thought of as the dispersal of energy or matter. These are useful but simple approaches to the subject. A more complete understanding of entropy change requires a bit more study.

Having introduced the offending first sentence, I'll go along with its removal: it's a common simplification which suffers from the same sort of problems as "disorder" and is better put in the context of the present third paragraph, which could also do with review. To meet the concern about starting off with "While.. something else", would it be possible to rephrase the leading sentence? Perhaps along these lines:
The concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously, in the context of the concept of energy, which is central to the first law of thermodynamics and deals with the conservation of energy.
There's more to the point made in that sentence than I appreciated at first, and I hope I'm not mangling it too much. ...dave souza, talk 03:16, 26 October 2006 (UTC)
A possible caution about the word "spontaneous". Spontaneous reaction is a term of art in chemistry, meaning a reaction with a negative change in Gibbs free energy, i.e. a positive change in total entropy. But a spontaneous reaction may or may not actually "spontaneously" occur in the everyday meaning of the word, depending on the height of any energy barrier. Just something people drafting this paragraph might like to be cautiously aware of at the back of their minds. Jheald 13:53, 26 October 2006 (UTC)

If we are starting to speculate what entropy is a measure of, we can as well use the formulation that the entropy difference is a measure of irreversibility. --Pjacobi 10:42, 26 October 2006 (UTC)


Correction of error

In Talk:Entropy of 23:26, 25 October, Bduke, a distinguished physical chemist and Wikipedia contributor and editor, erred seriously in two points. Quoting him (with my insertions of numbers), “[1] Maybe Frank Lambert's approach will spread, [2] but have a look at Atkins, "Physical Chemistry", 8th edition, where it has spread somewhat but the terms disorder and, I think chaos, are still used.”

In early October Bduke was not active in Talk:Entropy and thus undoubtedly was unaware of http://en.wikipedia.org/wiki/Talk:Entropy#Non-notable.3F of 9 October. Therein, his point [1], a questioning of the spread of an approach to entropy as the measure of the dispersal of energy/T in a process, is conclusively countered. That the majority of new editions of established US chemistry textbooks have adopted it within just three years of the publication of my principal articles is nothing less than astonishing support for the approach. Sales figures are guarded by publishers, but it is more probable than not that the majority of chemistry students in the US since Fall 2005 have been taught this approach.
Thus, the approach has already spread remarkably widely. It is not an individual’s POV. (Of course it may take a generation to reach all scientists and science texts. That is the common history of new concepts, is it not? But this approach already has been proved to have legs!)

Bduke’s second point [2] is a grave error, perhaps because he just received his copy of Atkins’ “Physical Chemistry”, 8th edition and has had time only to scan it rapidly. My gift copy came in March and I know its comments re entropy well, especially in comparison to previous editions. The change is not “somewhat” but profound and thorough.

In prior editions (with his italics in the following quotes), Atkins defined spontaneous changes as “accompanied by a dispersal of energy into a more disordered form.” The direction of change, he wrote, is one “that leads to the greater chaotic dispersal of the total energy…” “The spreading of the object’s energy into the surroundings as thermal motion is a natural consequence of chaos.” “[The thermodynamic definition of entropy] is motivated by the idea that a change in the extent to which energy is dispersed chaotically can be derived by noting the quantity of energy that is transferred as heat…” [sic]. “The definition of a spontaneous change and its interpretation [is] in terms of the tendency of an isolated system to achieve greater disorder.” There are some 27 uses of the words ‘order’ and ‘disorder’ in discussing particular applications of entropy change.
The foregoing is certainly an emphasis on disorder and chaos in interpreting entropy, even when the dispersal of energy is associated with entropy change. Contrast this to the 8th edition with its numerous clear statements about entropy, entropy change, and the dispersal of energy, never using ‘disorder’ in connection with a definition of entropy. (The word ‘chaos’ does not appear in the Second Law chapter. I visually checked and computer word-searched. The words ‘order’ or ‘disorderly’ appear a total of 4 times.)
p. 77: “Can it be…that the direction of change is related to the distribution of energy?...this idea is the key…spontaneous changes are always accompanied by a dispersal of energy.” “…the signpost of spontaneous change: we look for the direction of change that leads to dispersal of the total energy of the isolated system.” p. 78: “…in due course, we shall see that dispersal of energy and matter accounts for change…” “We shall see that the entropy…is a measure of the energy dispersed in a process…” “The thermodynamic definition of entropy…is motivated by the idea that a change in the extent to which energy is dispersed depends on how much energy is transferred as heat.”
p. 81 “The concept of the number of microstates makes quantitative the ill-defined qualitative concepts of ‘disorder’ and ‘dispersal of matter and energy’ that are used widely to introduce the concept of entropy” [[Note: This quote appears to put ‘disorder’ on an equal footing with the qualitative ‘dispersal of matter and energy’. However, Atkins consistently uses ‘dispersal of energy’ or of ‘energy and matter’ in his qualitative definitions of entropy change whereas he has discarded that use of ‘disorder’ in defining entropy here in this edition.]] p. 88: “[re phase change] This decrease in entropy is consistent with localization of matter and energy that accompanies the formation of a solid from a liquid or a liquid from a gas….If the transition is endothermic [as in melting, vaporization] the entropy change is positive, which is consistent with dispersal of energy and matter in the system.” p. 89: “…Trouton’s rule …strong molecular interaction…As a result there is a greater dispersal of energy and matter when the liquid turns into a vapour than would occur for a liquid in which molecular motion is less restricted.”

For over twenty years Atkins has used the phrase “energy dispersal” in connection with entropy change and entropy increase, but always in previous books and text editions it has involved “to disorder” or “to chaos”. Only in this new 8th edition has he deleted the connection of energy dispersal to ‘disorder’, consistently repeated that entropy change is related to energy dispersal, and associated energy dispersal with a greater number of microstates. This has been my emphasis since my original publications in 2002. However, I have also emphasized giving the beginning student (who may not be able to handle abstract concepts such as microstates) a thorough qualitative interpretation of all kinds of entropy change in terms of energy dispersal: http://en.wikipedia.org/wiki/Talk:Entropy/Archive4#Spontaneous_entropy_increase_always_involves_the_dispersal_of_energy and http://en.wikipedia.org/wiki/Talk:Entropy#The_.E2.80.9CEntropy_of_Mixing_.E2.80.9C FrankLambert 05:52, 27 October 2006 (UTC)

User:Kenosis has recently started a duplicate article on entropy; administrator User:W.marsh recently put merge tags from Introduction to entropy to entropy. Kenosis reverted this? Having duplicate pages of the same topic is a bad idea, it creates a big mess with many problems and with people unknowingly editing on separate pages. For example, gravity and gravitation were separate for almost a year, this created many redundant edits, and it was a big pain to merge these. The Introduction to entropy page is almost duplicate to the entropy (energy dispersal) page; I suggest that it be merged there. Any objections? --Sadi Carnot 08:55, 28 October 2006 (UTC)

We just discussed this issue on this talk page within the past couple of weeks. User:W.marsh obviously hadn't read the discussion, and evidently covers a very wide swath of Wikipedia, unable to become intimately familiar with many of the topics (s)he runs across. That he or she is an admin is irrelevant. ... Kenosis 16:01, 28 October 2006 (UTC)
It's early days for both articles: I've made some edits to the Introduction to entropy moving it towards the aim expressed in earlier discussions of providing an accessible approach to entropy for those who find this entropy article too technical. The entropy (energy dispersal) page has value as an account of development of teaching methods. None of this should detract from the need to make this article accessible to beginners: the recent edits to the intro look useful but once again introduce highly technical terms at the outset: I'd be interested in Bduke's views on this as a start to the article. ..dave souza, talk 09:50, 28 October 2006 (UTC)
No merge. Allow the articles to develop and then we'll talk. While this article has improved somewhat, it is still flawed. And, given the hostility some here have shown to various aspects as covered on the other two articles, I think it best if those articles are left to develop before another pointless edit war erupts here. •Jim62sch• 11:55, 28 October 2006 (UTC)

I wasn't aware that this "intro" side article approach was part of an introductory science article series. It seems neither was User:W.marsh. Someone should make a template for all the talk pages in this series, stating that these articles are part of a series, so as to avoid further confusion. Also, I would refrain from using the words "teaching approach"; isn't the point of Wikipedia to teach people? Isn’t the point of writing about entropy in any article or book to teach people? Also, I would suggest that the intro article be written from a variety of sources and perspectives (such as Van Ness' Understanding Thermodynamics [103 pgs], Mahan's Elementary Chemical Thermodynamics [153 pgs.], or Schroeder's Introduction to Thermal Physics [396 pgs]); the current article seems to be the "introduction of entropy" according to Frank Lambert. It is not my concern, but his website has many errors and inaccuracies, and is written in a manner to disparage Gibbs, Boltzmann, Helmholtz, and others, thus giving the newcomer the wrong perspective. Thus I would be cautious in using it as the main source and try to use textbooks and classic articles as main sources and use Lambert as a third or fourth source. Later: --Sadi Carnot 13:28, 28 October 2006 (UTC)

There is no "series" of introductory articles per se. Most of the topic forks split into introductory articles were independently developed by separate sets of editors. A number of these are organized into a list and became part of our discussion merely to illustrate that there is ample precedent for the approach. ... Kenosis 16:10, 28 October 2006 (UTC)
And, kindly give this new article reasonable time to develop. Asserting that it should be solely focused on traditional expressions of the thermodynamic usage of "entropy" unduly prejudices some of the central issues that article is intended to address at an introductory level. In due course, other citations will make their way into that article to achieve a wider balance of sources involving fields other than chemistry. ... Kenosis 16:17, 28 October 2006 (UTC)
Would you be willing to point out the errors, and to back up why they are errors? Are you aware that the 8th edition of Atkins uses Lambert's ideas to explain entropy, rather than any (to me) nonsense about chaos and disorder? Did you know that the chaos/disorder definition is losing steam (no pun), and is furthermore behind the foolish claims of Creationists that entropy means that life could not have evolved? •Jim62sch• 19:35, 28 October 2006 (UTC)

Folks, my point is that articles need to have a balanced representation. To think that the entire world over solely reads Atkins is a naïve perspective. Yes, of course, there are many that do read Atkins, but there are hundreds of textbook authors every year who publish chapters containing thoughts on entropy. Two weeks ago, for example, I bought the new 2006 encyclopedia Encarta premium edition, its entry for entropy (physics) states the following:

From the second law, it follows that in an isolated system (one that has no interactions with the surroundings) internal portions at different temperatures will always adjust to a single uniform temperature and thus produce equilibrium. This can also be applied to other internal properties that may be different initially. If milk is poured into a cup of coffee, for example, the two substances will continue to mix until they are inseparable and can no longer be differentiated. Thus, an initial separate or ordered state is turned into a mixed or disordered state. These ideas can be expressed by a thermodynamic property, called the entropy (first formulated by Clausius), which serves as a measure of how close a system is to equilibrium—that is, to perfect internal disorder. The entropy of an isolated system, and of the universe as a whole, can only increase, and when equilibrium is eventually reached, no more internal change of any form is possible. Applied to the universe as a whole, this principle suggests that eventually all temperature in space becomes uniform, resulting in the so-called heat death of the universe.

Locally, the entropy can be lowered by external action. This applies to machines, such as a refrigerator, where the entropy in the cold chamber is being reduced, and to living organisms. This local increase in order is, however, only possible at the expense of an entropy increase in the surroundings; here more disorder must be created.

This continued increase in entropy is related to the observed nonreversibility of macroscopic processes. If a process were spontaneously reversible—that is, if, after undergoing a process, both it and all the surroundings could be brought back to their initial state—the entropy would remain constant in violation of the second law. While this is true for macroscopic processes, and therefore corresponds to daily experience, it does not apply to microscopic processes, which are believed to be reversible. Thus, chemical reactions between individual molecules are not governed by the second law, which applies only to macroscopic ensembles.

From the promulgation of the second law, thermodynamics went on to other advances and applications in physics, chemistry, and engineering. Most chemical engineering, all power-plant engineering, and air-conditioning and low-temperature physics are just a few of the fields that owe their theoretical basis to thermodynamics and to the subsequent achievements of such scientists as Maxwell, the American physicist J. Willard Gibbs, the German physical chemist Walther Hermann Nernst, and the Norwegian-born American chemist Lars Onsager. Microsoft ® Encarta ® 2006. © 1993-2005 Microsoft Corporation. All rights reserved.

Now, there are subtle errors in this paragraph as well, e.g. equilibrium = perfect internal disorder (not exactly correct), and it of course needs a good cleaning, but the point of showing it here is to highlight the fact that the disorder perspective is commonly used, whether we like it or not. It is not the aim of Wikipedia to censor well-established perspectives just because we don't like them. The concept of "changes in the ordering of molecules with heat transfer" traces all the way back to the writing of Clausius (and possibly others before him), e.g. see disgregation. Building on the writings of Clausius there was Maxwell. In 1859, after reading a paper on the diffusion of molecules by Rudolf Clausius, Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics. Five years later, in 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell’s paper and was so inspired by it that he spent much of his long and distinguished life developing the subject further. Gibbs and others also built on the writings of Clausius. Lewis, Randall, Guggenheim, and many others built on the writings of Gibbs. To try to argue that the "ordering of molecules" (or conversely the "dis-ordering" of molecules) is now an irrelevant perspective is akin to throwing the entire history of thermodynamics as well as its founders out the window. --Sadi Carnot 15:57, 29 October 2006 (UTC)

Thanks, Sadi, for an interesting example of the "disorder" perspective you favour. I too consider that there should be no censorship of verified viewpoints, and want the article to properly present such viewpoints in a way that the ordinary reader can follow. Your enthusiasm for perspectives pre-1903 is appreciated, but newer views should not be excluded, especially when they give a clearer introductory explanation. Of course Encarta is not infallible and your disagreement in detail with their definition is noted, but it's interesting that they essentially seem to equate uniformity and equilibrium with disorder. My dictionary defines disorder as "want of order, confusion, disturbance, breach of the peace, disease", so it's no wonder many people find the term confusing. Unfortunately I've been unable to find any clarification of this unusual understanding of the term in our article: perhaps you could point it out, or add a suitable explanation? ...dave souza, talk 17:56, 29 October 2006 (UTC)
Boltzmann seems to introduce the topic on pg. 40 (Lectures on Gas Theory, Dover Ed.) where he introduces the terms molar-ordered and molar-disordered distributions of molecules. P.S. I don't favor anything except a historically correct representation of information. Out of time for today. Talk later: --Sadi Carnot 18:49, 29 October 2006 (UTC)
That sounds interesting and useful for an understanding of what he meant at the time. Don't forget that to be useful nowadays as an introductory analogy it has to be related to modern science in a way that can be understood by a non-mathematician. .... dave souza, talk 23:03, 29 October 2006 (UTC)
I'm guessing that basically what you're looking for is a discussion of the classic "entropy order/disorder evolution paradox", i.e. why, according to the second law, if entropy tends to increase, which according to Boltzmann correlates to disorder, does evolution (an ordering process) occur?--Sadi Carnot 09:21, 30 October 2006 (UTC)
No, what I'd like the article to include is a clear explanation of what "disorder" means as an introduction for beginners in the subject, and it will also help if Boltzmann's intended meaning can be clarified in its historical context and in the light of later scientific discoveries – a brief clarification of this can go here, more detail would go in related articles. ..dave souza, talk 15:15, 30 October 2006 (UTC)
If you can get hold of a copy of Callen (1985) "Thermodynamics and an Introduction to Thermostatistics", there is an entire chapter on the subject. Chapter 17 is entitled "Entropy and Disorder: Generalized Canonical Formulations", with three subsections entitled:
  • Entropy as a Measure of Disorder
  • Distributions of Maximal Disorder
  • The Grand Canonical Formalism
The concept of disorder is precisely defined with 5 quantitative properties given, and it is shown to be equivalent to thermodynamic entropy (to within a constant). I disagree with the development, because the definition of disorder is too conveniently close to the definition of entropy, so the conclusion is foregone. Which is, by the way, the same objection I have to Leff's definition of dispersal - when applied to energy it is FAR too conveniently close to a definition of entropy, so the conclusion that entropy change is energy dispersal is foregone. As a qualitative introduction, both viewpoints are useful, but with deeper analysis both become a hindrance if taken literally. PAR 17:12, 30 October 2006 (UTC)
Given the highly articulate writing and educated background of some of the involved editors here, my offhand sense is that it should be quite possible to put these characterizations in perspective for the reader of the WP article. The 20th-century use of "disorder," which began with Boltzmann, is plainly a pre-quantum-mechanics view that remains meaningful to chemists in particular and also to others who have a perspective of what happens at the microscopic level. As Bduke put it above, "...if you look at a molecular level simulation of water at 373K and compare it with one at 273K it certainly looks more chaotic." It appears there remains a gap in this characterization when one begins trying to communicate the concept of the spread of "disorder" from a heat source throughout a medium, and into or through the adjacent or surrounding media (or vice versa in the case of a negative delta-S). The newer approach using the concept of "dispersal of energy" appears to explain this latter part of the phenomenon quite sensibly, as it applies to both a microscopic and a macroscopic perspective. And since more textbooks are choosing to use the descriptor "dispersal of energy", it would seem to deserve conspicuous mention, in its proper perspective, right alongside the more traditional approach. It seems it would require only a very brief note of some kind to the reader of the WP article, perhaps in a footnote, to keep these two approaches in proper perspective as to their relative current levels of acceptance and influence in accordance with WP:NPOV#undue_weight. ... Kenosis 17:40, 30 October 2006 (UTC)

Thanks to Sadi Carnot

Sadi Carnot indeed has an extensive library in thermodynamics and clearly is a diligent scholar. (Note the volumes, as you scroll down to his remarkable educational background at http://www.humanthermodynamics.com/Libb-Thims.html).

Therefore, I am truly sorry that he, in words similar to those that a successful text author in 2003 wrote to me, believes that I disparage Gibbs et al. It further grieves me that Carnot says that I have 'many errors' in my writing. Unfortunately my ideas have been adopted by the majority of new editions of US first year chemistry texts for chemistry majors -- including the new 2006 edition by the author who claimed that I had disparaged Gibbs! (See http://en.wikipedia.org/wiki/Talk:Entropy/Archive5#Non-notable.3F )
I wish that Carnot also had a copy of Atkins' 8th in his library, the worldwide pre-eminent physical chemistry text whose conceptual definition of entropy is now uniformly in terms of the dispersal of energy, very similar to those that I have employed since 2002. (http://en.wikipedia.org/wiki/Talk:Entropy#Correction_of_error) Carnot is kind in advising others to use me as a 'third or fourth source' and I'll try to improve by further study about the do-deca-bond for humans that Carnot has ingeniously devised to explain one of my major problems: http://www.humanthermodynamics.com/HT-Glossary.html#anchor_109 . FrankLambert 01:28, 29 October 2006 (UTC)
I don’t know if your comments are made as puns, but whatever the case I have heard every variety of negative and/or positive comment one can imagine. Atkins’ 8th Ed. Physical Chemistry (2006) is $162 (used); I rarely order books over $100. If you can suggest a cheaper version, by possibly another author, I might be inclined to order it. Regarding entropy = energy dispersal, I am going to gauge, based on my reading experience, that less than 10% of the scientific community sees entropy in terms of energy dispersal. It is mostly found in very beginner introductory books. In Andrew Scott’s 2001 introductory book 101 Key Ideas – Chemistry (100 pgs), for example, he has one page (pg. 41) devoted to entropy, where he states: “Entropy can be defined in strict mathematical terms, but we can understand it simply by appreciating that increasing entropy is associated with the automatic dispersal of energy and matter.” He says that “chemical reactions occur because energy and matter move towards more dispersed arrangements.” He concludes “the chemistry that lets plants grow would never happen on a cold dark earth with no sun but it is made to happen when the chemicals in plants have the energy of sunlight continually dispersing through them.” Now from what I’ve read this is basically Atkins’ concept derived for beginners. Now there are many shades of buried close truthfulness here, yet this paragraph is so vague, e.g. isn’t everything in the universe “energy and matter” (does that mean that everything that spreads relates to entropy), and so far removed from the study of the Carnot cycle (e.g. the work that the molecules of the working substance do on each other), and has so many arguable points of consistency in it, that any learned scientist could poke holes in this all day long, e.g. reactions occur because energy and matter like dispersed arrangements (this latter statement is almost backwards). We should try to limit the amount of this type of writing that goes into Wikipedia to establish our credibility. The essence of entropy is about the work energy associated with irreversibility in engine cycles. One must always be cautious about watering this essential basis down so much that it is not even the original topic anymore. --Sadi Carnot 16:49, 29 October 2006 (UTC)
It would appear to be a rapidly growing approach, which seems to be in some part due to Frank Lambert's work. Consistently with WP:NPOV#Undue_weight, this approach appears to deserve inclusion, appropriately qualified that it is currently a minority position but has been increasingly used within the past several years. (This explanatory approach appears to be making very rapid progress towards acceptance. Atkins does also appear to take a very wide view of the range of phenomena that are reasonably termed entropy -- a long, long way from the original "waste" energy for sure, though plainly it can still be described in terms of q/T. Given enough strategically located measurements, it seems like it could apply just about anywhere.) ... Kenosis 17:10, 29 October 2006 (UTC)
This approach has a representative inclusion here: Entropy#Energy dispersal. Isn't this a done deal? Let's move on to something else … like making a new diagram, what entropy diagram would people like to see? I think that the "Two containers of gas connected via a stopper" is very famous. --Sadi Carnot 17:28, 29 October 2006 (UTC)
Is this a call for disclosure of used literature? I for one use or plan to use the Boltzmann selected papers from Ostwalds Klassiker der exakten Wissenschaften, the Sommerfeld book, the Pauli lectures, the Fermi booklet, Becker's Theorie der Wärme and occasionally the Landau/Lifshitz series. Do you think I'm missing a relevant perspective?
Just checked: it still seems to be required reading in a typical German university course, see [1]. Hmm, perhaps I should add JD Fast, Entropie to my library. Bad luck for you guys, I've taken the only used one available at Amazon. I'll report back later on whether it was worth the EUR 7.95.
Pjacobi 17:35, 29 October 2006 (UTC)
I think JD Fast is worth the money, it's a good book and it is possibly an original source for the rubber band stretching entropy change example, which I see around so many places. Later: --Sadi Carnot 18:30, 29 October 2006 (UTC)

Regarding S = Sadi symbol assignment (speculation)

As to the following paragraph:

Although Clausius did not specify why he chose the symbol "S" to represent entropy, it is arguable that Clausius chose "S" in honor of S. Carnot, i.e. Sadi Carnot[citation needed], to whose 1824 article Clausius devoted over 15 years of work and research. From the first page of his original 1850 article "On the Motive Power of Heat, and on the Laws which can be Deduced from it for the Theory of Heat", Clausius calls "S. Carnot" the most important of the researchers in the theory of heat.

I brought this here to discuss the fact tag. It may be in the 400+ pages of works and/or notes of Clausius somewhere, but I have only limited access to these. This seems to be an etymological issue, similar to the origin of the word "chemistry", e.g. see: Talk:Chemistry#Core etymology to intro. I think finding the actual mental note by Clausius on his assignment of "S" will be difficult. Yet it can be easily argued that, just as G = Gibbs, J = Joule, K = Kelvin, etc., Clausius chose S = Sadi (first name by default, since C = heat capacity had already been in use since the late 18th century by those such as Joseph Black). Does anyone have a problem with me removing the [citation needed] tag and replacing it with an argument footnote # at the bottom of the page? Then later, when someone finds a better explanation or an exact source, we can amend this. --Sadi Carnot 17:18, 29 October 2006 (UTC)

I will amend this as there seem to be no objections; when someone finds a better source we can amend this further. --Sadi Carnot 10:53, 31 October 2006 (UTC)

Moved to: Talk:Entropy (order and disorder)

Future article tasks

Hi, I added a new pic (thermal-solar-system) and two new references; later I would like to add the "entropy balance" equation for open systems, i.e. those that exchange heat, work, and mass with their surroundings (a sketch of the standard form appears below, after the list). Lastly, I collected the following stats to help clarify the prevalence of terms via Google search results, as an outline for future work on the entropy article:

  1. entropy energy time – 6,490,000 results
  2. entropy energy order – 5,310,000 results
  3. entropy energy information – 5,070,000 results
  4. entropy energy life – 1,670,000 results
  5. entropy energy chaos – 1,050,000 results
  6. entropy energy disorder – 694,000 results
  7. entropy energy dispersion – 639,000 results
  8. entropy energy dissipation – 503,000 results
  9. entropy energy irreversibility – 164,000 results
  10. entropy energy dispersal – 63,000 results
  11. entropy energy disgregation – 88 results

If anyone knows of other related search terms, please add them to the list. Thanks: --Sadi Carnot 18:05, 5 November 2006 (UTC)
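For reference, the open-system entropy balance mentioned above typically takes the following form in engineering thermodynamics texts (a sketch of the standard statement rather than a quotation from any particular source), where Q̇j is the heat transfer rate across a boundary at temperature Tj, ṁs is the entropy carried by mass flowing in or out, and Ṡgen ≥ 0 is the entropy generated by irreversibility:

```latex
\frac{dS_{\mathrm{sys}}}{dt}
  = \sum_{j} \frac{\dot{Q}_j}{T_j}
  + \sum_{\mathrm{in}} \dot{m}\, s
  - \sum_{\mathrm{out}} \dot{m}\, s
  + \dot{S}_{\mathrm{gen}},
\qquad
\dot{S}_{\mathrm{gen}} \ge 0
```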

Seems to me you're placing too much credence in Google results; the number of hits is irrelevant as a measure of determining validity. Astrology, for example, returns 3,540,000 hits. Additionally, 5 and 6 are functionally the same, as are 7, 8, 10 and 11. •Jim62sch• 19:24, 5 November 2006 (UTC)
Yes, thanks Jim62. I know you favor #10. Myself, I favor all of them. In addition, each term has a history all its own, e.g. Ilya Prigogine's 1970s dissipative structure theory traces back to William Thomson's 1850 dissipation of mechanical energy paper, which traces back to others before him. You have to read each term in the context of the paper in which it originally developed and as it is used currently. I put the list here as a "loose outline", so as to make sure that each topic gets a balanced representation in the article. Later: --Sadi Carnot 14:48, 6 November 2006 (UTC)
It's not a matter of what I favour, it's a matter of sloppy research. •Jim62sch• 22:11, 6 November 2006 (UTC)

While I appreciate Sadi's enthusiasm for Google popularity contests and for the detailed history of the subject, the article at the moment is overloaded with outdated equations and historical developments, and the last thing it needs is a checklist of terms to be included on the basis of how common they are on the internet. The Undergraduate students' understandings of entropy and Gibbs Free energy pdf clearly shows the disadvantage of using "disorder" and "chaos" to describe entropy, and while they should be mentioned, the focus of the article on explaining entropy needs to be improved. Having said that, we have entropy "defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium—that is, to perfect internal disorder" without any clarification as to how disorder equates to equilibrium, but maybe that's an impossible task and hardly worthwhile in view of the clear disadvantages of using disorder and chaos in descriptions. ... dave souza, talk 22:27, 6 November 2006 (UTC)

Well of course equilibrium is perfect internal disorder -- can't you see that? No? Neither can anyone capable of thinking outside the box. Parroting age-old bullshit because it fits in nicely with the human need to find order in the universe is not a virtue, it is a stultification of the intellect. •Jim62sch• 22:33, 6 November 2006 (UTC)
Jim62, please be nice. Dave souza, thanks for the article, I printed it out. The Encarta definition simply shows their perspective. Encarta, by the way, is supposedly the world's most utilized encyclopedia; although Britannica is much better. --Sadi Carnot 03:28, 7 November 2006 (UTC)
Well, if certain statements in the article that offend my intelligence are better phrased or corrected, I shall be very nice. Problem is, while the current course of the article is a definite improvement, there are still too many things that are relics of 150-year-old misunderstandings. But please keep in mind that I'm referring to certain statements in the article and to the ongoing misconceptions, not to any editor.
Agreed for the most part on Britannica. •Jim62sch• 22:11, 7 November 2006 (UTC)
Jim62, try reading any brand new thermodynamics textbook and you might walk away with a different perspective. Later: --Sadi Carnot 01:30, 8 November 2006 (UTC)
A different perspective in what way? That teaching entropy as chaos/disorder has ceased to be done (or at least decreased), or that it is still rampant and still wrong? It's ironic that some have quibbled over the exact meaning of "dispersal", and yet have clung to a definition of disorder that equates to a messy room. Of course, given the prevailing senses of disorder, I suppose that can't be avoided, and the real issue is whether the word disorder is even appropriate. •Jim62sch• 09:04, 9 November 2006 (UTC)

Molecular disorder

Jim62, today I bought a new 2005 thermodynamics textbook; it has the following definition:

  • Entropy - a quantity that refers to changes in the status quo of the system and is a measure of molecular disorder and the amount of wasted energy in a dynamical energy transformation from one state or form to another.
Source: Haddad, Wassim M. (2005). Thermodynamics – A Dynamical Systems Approach. Princeton University Press. ISBN 0691123276.

Later: --Sadi Carnot 10:22, 9 November 2006 (UTC)

And? Many textbooks still explain the "big bang" as just that, when it was in fact anything but. Many textbooks still explain gravity as if it were a "mysterious" force, and ignore the warping of space that makes gravity possible. Just because something is in a textbook does not grant it inherent validity. •Jim62sch• 00:18, 10 November 2006 (UTC)

How's this for an exact definition of disorder? It comes from Callen's "Thermodynamics and an Introduction to Thermostatistics", which is the most referenced source on thermodynamics in the world, bar none.

The disorder in a given set of N numbers f1, f2, ..., fN is measured by a disorder function D(f1, f2, ..., fN):

  • The measure of disorder should be defined entirely in terms of the set of numbers (fj).
  • If any one of the fj is unity (and all the rest consequently are zero), the system is completely ordered. The quantitative measure of disorder should then be zero.
  • The maximum disorder corresponds to each fj being equal to 1/N.
  • The maximum disorder should be an increasing function of N.
  • The disorder should compound additively over "partial disorders": if D1 is the disorder in the first set, of which there are N1 elements, and D2 is the disorder in the second set, of which there are N2 elements, then the total disorder is D = D1 + D2.

These assumptions are only obeyed by the function

  D = −k Σj fj ln fj

where k is an arbitrary constant. For a thermodynamic system, take fj to be the probability that the j-th microstate is occupied by the system. Then, for a closed thermodynamic system, it is seen that the thermodynamic entropy corresponds to the maximum possible disorder in the distribution of the system over its permissible microstates.
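As a quick concrete check on those properties, here is a small numerical sketch (my own illustration, with k set to 1, not taken from Callen). Note that the maximum value, k ln N, is just the Boltzmann form for N equally probable microstates, which is the correspondence with entropy described above.

```python
import math

def disorder(f, k=1.0):
    """Callen-style disorder D = -k * sum(f_j * ln f_j) over a distribution f.
    Zero-probability terms contribute nothing (the x*ln(x) -> 0 limit)."""
    return -k * sum(fj * math.log(fj) for fj in f if fj > 0.0)

N = 4
ordered = [1.0] + [0.0] * (N - 1)   # one outcome certain: complete order
uniform = [1.0 / N] * N             # every outcome equally probable

print(disorder(ordered))                  # 0.0: complete order gives zero disorder
print(disorder(uniform), math.log(N))     # ~1.386 both ways: maximum disorder is k ln N
print(disorder([1.0 / 8] * 8) > disorder(uniform))  # True: the maximum grows with N
```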

Yes, nice. There's a forest out there you guys keep missing because of a few big, but gnarled trees. •Jim62sch• 02:15, 10 November 2006 (UTC)
Good work PAR, I suggest that you add this, w/ the source, to the entropy (order and disorder) article and show how these formulas provide a quantitative measure of entropy for molecular systems. Later: --Sadi Carnot 11:24, 10 November 2006 (UTC)
The problem is that I don't like the above definition. It is just too convenient. The description of disorder is really just a disguised description of entropy. To say that disorder by this definition is equivalent to entropy is redundant. One of the valid objections to using disorder as entropy is that disorder, like beauty, is in the mind of the beholder. The above definition's claim that "perfect order means all fj = 0 except one of them" does not strike me as an obvious characteristic of disorder.
The reason I listed it is to show that it is not correct to say that the definition of disorder is vague or ill-defined.
Before the anti-disorder advocates get all happy, this is exactly the same objection I have to entropy as energy dispersal. Dr. Leff's article (which I will email to anyone who wants it) finally provides a quantitative description of energy dispersal which is just as precise as the above description of disorder - with the same flaws. It is a disguised definition of entropy, and equating it to entropy is redundant.
I think both descriptions are useful on a beginning level, but clinging to them while delving deeper into the meaning of entropy is counterproductive. I wish the article would reflect this. PAR 20:14, 10 November 2006 (UTC)
Agreed that both descriptions are aimed at a beginning level: the published studies indicate that "disorder" is counterproductive at that level, whether either is useful when delving deeper is at best unproven. ... dave souza, talk 09:20, 12 November 2006 (UTC)

Moved to Talk:Entropy and life

Why? If it affects what's in this particular article it should not be moved.
In any case, this sentence needs work, "Over the last century, as stemming from Clausius' 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life." •Jim62sch• 21:48, 8 November 2006 (UTC)
I edited it based on the examples at the other talk page. •Jim62sch• 21:51, 8 November 2006 (UTC)
Jim62, good changes; I agree with you on the move, but for the sake of organization I moved it. Having simultaneous talks going on two separate pages on the same topic is messy, e.g. I just replied to a comment by Pjacobi on the same issue, related to the changes you just made, on the other talk page. P.S. I'll be cutting back on my edit time over the next two months. Later: --Sadi Carnot 06:28, 9 November 2006 (UTC)

Ice melting

I'm unhappy with the teaser picture. Both ice melting and water freezing can occur spontaneously and with increasing entropy. It all depends on the mixture and temperature of the ingredients. --Pjacobi 19:37, 8 November 2006 (UTC)

Maybe change the caption to "Ice melting in a warm room"? PAR 19:55, 8 November 2006 (UTC)
Ice melting, or solid/liquid/gas phase and structure change for water, is the most common entropy change example of all time. Ice melting and entropy change is in nearly every chemistry textbook as an example. Didn't you see the disgregation article I wrote just to accompany that picture? If, however, you know of more entropy change picture examples, feel free to add them to the article. P.S. it’s worded to match the exact wording that Clausius originally used. Later: --Sadi Carnot 06:36, 9 November 2006 (UTC)
I thought that Pjacobi was making the point that if you have a glass of ice and water and the ice is very cold, it could freeze the water, and still entropy would increase. Correct me if I am wrong, but I thought his problem could be solved by re-doing the caption, not the picture. PAR 20:14, 10 November 2006 (UTC)
That's what I meant. But only re-doing the caption wouldn't do anything against the preconception that only melting, not freezing, can increase entropy. --Pjacobi 20:25, 10 November 2006 (UTC)
May I humbly suggest "In the system of ice and water, energy disperses from the water to the ice, reducing the temperature differential and increasing entropy." If you wanted to include the melting/freezing aspect, this could be covered by "If the resulting equilibrium is above 273 K (0 °C) the ice will have melted; if below that temperature, the water will have frozen." Just an idea. ...dave souza, talk 09:08, 12 November 2006 (UTC)
That, or something similar, sounds fine to me. PAR 10:31, 12 November 2006 (UTC)

I would object to that phrasing. The "ice+water" is the system of interest, and the room is the surroundings. There is no temperature differential between ice and water. They are both at the same temperature. The temperature differential is between the room and the "ice+water" system. LeBofSportif 13:15, 12 November 2006 (UTC)

I think that scenario is being considered only as a possibility, but if it is always considered to be true, the caption on the picture should make that point. Either way, you make a good point - To be correct the first sentence above should read - "In the system of ice and water, energy disperses from the water to the ice, with a subsequent increase in entropy" (dropping the "temperature differential" part). The point that Pjacobi was making was to consider the possibility that the ice was colder than the water, in order to avoid the prejudice that ice melting in an ice/water system was always an increase in entropy. To point out that water freezing in the ice/water system could be an entropy increase as well. PAR 16:15, 12 November 2006 (UTC)
"Ice melting" - classic example of entropy increase described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.
A fair point which I was leaving to the more detailed paragraph below, and of course the temperatures in the "ice+water" system will depend on timing: a warmer drink will have sub-zero ice from the freezer added, then will tend towards zero. PAR's suggestion of removing the "temperature differential" point would avoid the problem, and we could always add something like "At the same time the water temperature will usually be affected by the surroundings – see explanation below." .... dave souza, talk 16:26, 12 November 2006 (UTC)
Please, folks, let's stick to the main-stream types of presentations on the main page. The phrase "energy disperses (spreads out) from the water to the ice", for example, is completely non-standard: it has been a huge point of controversy on this talk page, it was recently at articles for deletion, where the majority of people had never heard of the term, and it is one of the lowest-ranked keywords according to Google search results. The correct and classical phrasing is: "heat flows from the hot body to the cold body and entropy increases." Remember, the concept of heat flow between bodies goes back to the 5th century B.C. days of Heraclitus and is the core principle of thermodynamics. Let's try to write an article that we can be proud of. --Sadi Carnot 06:28, 13 November 2006 (UTC)
Sadi - I'm not going to get bent out of shape over a case where there actually is "energy dispersal" or "heat flow". I DO oppose the idea that ALL entropy change is "energy dispersal" or "heat flow". I'm not well versed on the conditions under which the standard ice/water/room example occurs, but we should state them explicitly in the caption, and hopefully avoid any misconceptions such as the idea that ice melting in water is always an entropy increase. What are the conditions for the ice/water/room example? Ice and water at 0 C and room at room temperature? Ok, fine, the ice melts, the ice/water system stays at 0 C, and the room - what? PAR 16:59, 13 November 2006 (UTC)
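On the "and the room - what?" question, the usual bookkeeping is a short sketch, assuming the room is large enough to act as a reservoir at roughly T_room = 293 K (an illustrative value) while a quantity of heat Q leaks into the ice/water system at 273 K:

\[ \Delta S_{\mathrm{sys}} = +\frac{Q}{273\,\mathrm{K}}, \qquad \Delta S_{\mathrm{room}} = -\frac{Q}{293\,\mathrm{K}}, \qquad \Delta S_{\mathrm{total}} = Q \left( \frac{1}{273\,\mathrm{K}} - \frac{1}{293\,\mathrm{K}} \right) > 0. \]

The room's entropy drops, but by less than the system's gain, and being large the room stays at essentially the same temperature.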

PAR, I'm not either. But we shouldn't put a controversial non-standard word in the one-sentence caption (below the photo); in the main article is fine. Through lengthy research efforts and long debate we have established that a few writers, e.g. Denbigh, K. (1955), Atkins, P. (1984), Leff, H.S. (1998), and Lambert, F. (2002), have taken it upon themselves to synthesize dispersal theories. Yet these are, by far, minority views. Below, for example, are the standard definitions of entropy, which the caption should reflect.

  • Entropy – a measure of the unavailability of a system’s energy to do work; also a measure of disorder; the higher the entropy the greater the disorder.[1]
  • Entropy – a non-conserved thermodynamic state function, measured in terms of the number of microstates a system can assume, which corresponds to a degradation in usable energy.[2]
  • Entropy – a measure of disorder; the higher the entropy the greater the disorder.[3]
  • Entropy – in thermodynamics, a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level; the greater the disorder the higher the entropy.[4]
  • Entropy – a measure of disorder in the universe or of the availability of the energy in a system to do work.[5]

References

  1. ^ Oxford Dictionary of Science, 2005
  2. ^ McGraw-Hill Concise Encyclopedia of Chemistry, 2004
  3. ^ Oxford Dictionary of Chemistry, 2004
  4. ^ Barnes & Noble's Essential Dictionary of Science, 2004
  5. ^ Gribbin's Encyclopedia of Particle Physics, 2000

Ice melting near its melting point and the entropy changes associated with it would take at least a two- to three-page discussion; you can't fit all of this into a photo caption. As options, we could put more into the Ice melting example section, or start a new page and add a link to it in the photo caption. The main photo caption is supposed to represent and embody the basic tenets of the concept of entropy. For the moment, I added a footer ref note to the caption. --Sadi Carnot 17:50, 13 November 2006 (UTC)

"perfect internal disorder"

34 hits (as that seems so bloody important), all of them related to the Encarta article. Great cite. <--sarcasm. •Jim62sch• 00:37, 10 November 2006 (UTC)

Yes, thank you Jim62. Per your request, I added another textbook source, i.e. Principles of Biochemistry by the noted American biochemist Albert Lehninger, which agrees with Encarta. Later: --Sadi Carnot
We still seem to be no closer to a readily understandable explanation of this unusual definition of "disorder" that equates it to equilibrium. Is there something in one of these references that summarises it for the general public? ... dave souza, talk 15:28, 10 November 2006 (UTC)
That Encarta article has some disordered writing, skipping between two definitions of entropy without clarifying the transition between the two meanings. ... Kenosis 15:57, 10 November 2006 (UTC)
I removed it -- the definition is unsupported by any other source. •Jim62sch• 23:27, 11 November 2006 (UTC)
To be correct, the removed section did not contain a definition. It contained a concept which is not defined in the article itself and is in fact supported (i.e. repeated) by many other sources. Which is not to say that I am in favor of emphasizing it. PAR 00:00, 12 November 2006 (UTC)
To be correct, the Encarta definition is only cited by sources cribbing from Encarta. •Jim62sch• 00:02, 12 November 2006 (UTC)
Callen, Landau and Lifshitz, Gurney, trust me, did not crib from Encarta. The disorder idea has a long and distinguished history even though it comes up short as far as I am concerned. Again, I am not in favor of emphasizing it. PAR 00:15, 12 November 2006 (UTC)
Refs please. Do they state the exact same nonsense? It is not a matter of disorder, and I think you know that. It is a matter of equating equilibrium with perfect internal disorder. Let's try to stay on subject here, shall we? •Jim62sch• 00:35, 12 November 2006 (UTC)
Interesting that none of the refs provided by Sadi support Encarta. •Jim62sch• 21:01, 12 November 2006 (UTC)

Let me see…here we are trying to write an encyclopedic article, but we can't use an encyclopedic source? No, I don't think so. Encarta, or for that matter any other encyclopedia, textbook, reputable book, notable article, etc., is a stand-alone source. Encarta happens to be the best-selling encyclopedia software brand in the United States (source: NPDTechWorldSM, March 1993-March 2005. Based on Total U.S. retail sales). A stand-alone source is simply that: it stands by itself as a reference stating its point of view. If you have a problem with their view, then send them a letter. This talk page is not the place to debate the merits of established theories. We are here to write an exceptional article, from all points of view, on the topic of entropy. --Sadi Carnot 06:09, 13 November 2006 (UTC)

"when all the molecules are in one place"

See Quantum Mechanics: the molecules are never all in one place. Oh yes, they could be at 0 K, but as we know the uncertainty principle rules out 0 K being an achievable temperature. So, is this a direct quote from a source, or a synthesis? I'm guessing the latter, as the ref deals with applied mathematics. •Jim62sch• 21:01, 12 November 2006 (UTC)
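For what it's worth, the statistical reading of "all the molecules in one place" is presumably via Boltzmann's relation, with W the number of microstates compatible with the macrostate: a perfectly localised configuration would mean W = 1 and hence zero entropy,

\[ S = k_B \ln W, \qquad W = 1 \;\Rightarrow\; S = 0, \]

a limit which the third law treats as unattainable, much as 0 K itself is.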

Yes, this is a direct quote from a source (that's why there is a footnote next to the sentence). --Sadi Carnot 05:54, 13 November 2006 (UTC)