Talk:Maximum entropy thermodynamics
== From stub to article ==
To do:
- technical note that, strictly, the entropy should be relative to a prior measure -> principle of minimum cross-entropy (Kullback-Leibler divergence). In thermodynamics we usually assume the "principle of equal a priori probabilities" over phase space, so the two are then equivalent. (A short statement of the equivalence is sketched after this list.)
- section on philosophical implications regarding the conceptual problems of statistical mechanics, the second law, etc.
-- DONE Jheald 22:07, 2 November 2005 (UTC)
- some more algebra, and a simple nonequilibrium example (e.g. Brownian motion?) -- a toy numerical sketch follows this list.
-- Jheald 12:47, 28 October 2005 (UTC)
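
For the prior-measure item above, a minimal statement of the equivalence (standard definitions, not drawn from the article itself). The relative entropy of a distribution p with respect to a prior measure m is

 S[p\,\|\,m] = -\sum_i p_i \ln\frac{p_i}{m_i} = -D_{\mathrm{KL}}(p\,\|\,m)

so maximizing S[p||m] subject to constraints is the same as minimizing the Kullback-Leibler divergence (the principle of minimum cross-entropy). With the uniform prior m_i = 1/N (equal a priori probabilities),

 S[p\,\|\,m] = -\sum_i p_i \ln p_i \;-\; \ln N

which differs from the Shannon entropy only by the constant ln N, so the two variational principles pick out the same distribution.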
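
For the "more algebra" item, a toy numerical sketch (my own illustration, with made-up energy levels; nothing here is taken from the article). Under a mean-energy constraint, MaxEnt gives the canonical distribution p_i = exp(-beta E_i)/Z, with beta fixed by the constraint; the root-find below solves for it:

 import numpy as np
 from scipy.optimize import brentq
 
 # Toy example: discrete energy levels plus a mean-energy constraint.
 E = np.array([0.0, 1.0, 2.0, 3.0])   # energy levels (arbitrary units)
 E_bar = 1.2                          # constrained mean energy <E>
 
 def mean_energy(beta):
     """Canonical-ensemble mean energy at inverse temperature beta."""
     w = np.exp(-beta * E)
     return np.sum(E * w) / np.sum(w)
 
 # beta is fixed by the constraint <E> = E_bar; bracket and solve.
 beta = brentq(lambda b: mean_energy(b) - E_bar, -10.0, 10.0)
 
 p = np.exp(-beta * E)
 p /= p.sum()                         # the MaxEnt (Gibbs) distribution
 S = -np.sum(p * np.log(p))           # the maximized Shannon entropy
 print(f"beta = {beta:.4f}, p = {np.round(p, 4)}, S = {S:.4f}")

The same machinery extends to several constraints (one Lagrange multiplier each); a genuinely nonequilibrium example like Brownian motion would need time-dependent constraints, which is presumably what the item is asking for.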
== Introduction could be more friendly ==
(from the Articles for deletion page):
Note to author - please consider adding a few paragraphs up front in layman talk before getting on to the partial differentials. There ought to be something you can say about maximum entropy that I can slip into a casual conversation. Denni☯ 23:56, 28 October 2005 (UTC)
- You know, I tend to agree with you. But the problem is that whatever egghead wrote the damn article probably would have real trouble talking to a regular human being. Someone that bright probably would have trouble answering a question as easy as "Hey man, what's up?" You'd probably get as an answer some canned, formulaic response that they'd learned for the question, if not some Larry-Wall-esque humor. If anything, I really fucking like this article. It reminds me of the good old days of Wikipedia, when PhD-level researchers from a buncha universities would pop on during a coffee break and spit out (arguably useful) articles on cutting-edge theory, heavy academia or particle physics. Eventually the whole project got taken over by hack librarians, Star Trek bloggers and geeky high-school students, with the result that they pretty much demanded a level of organization to the writing that your average professor was unable or unwilling to manage on their coffee break. This scared them off, and most articles, even about important subjects, end up bland, overly-verifiable and smelling like they were written by some Windex-and-Spam-fried committee. Honestly, the only good articles left on Wikipedia are the ones on topics so obscure that people find nothing in them to be contentious. Like this article, for example. But anyways, brother, I certainly see your points. Sorry for the outburst. Long live Myspace, Facebook, Wal-Mart and Tom Hanks. —Preceding unsigned comment added by 10.250.65.158 (talk) 01:22, 6 October 2007 (UTC)
- Me again. There's the additional problem that sufficiently complex information reaches a point of irreducibility beyond which any further boiling-down deprives the subject matter of its inherent factuality (cf. What the Bleep Do We Know!? http://en.wikipedia.org/wiki/What_the_Bleep_Do_We_Know%21%3F , a film designed to make quantum mechanics comprehensible to people of average intellect and new-age spiritual interest). You try to make something like that so that regular folks can understand it, and the result is a hodge-podge of miswrought analogies and irreconcilable metaphors. Really, people are better off knowing that they just don't fucking know anything about a particular subject than thinking they can define five or ten vocabulary words of the industry jargon.
== Average entropy, measured entropy and entropy fluctuations ==
At the moment the article isn't very clear about when it's talking about expectations (either as a constraint or as a prediction) and when about actual measurements. For example, in the discussion of the second law, the measured macroscopic quantities probably won't come in bang on the nose of the predicted values -- instead (we assume) they will lie within the predicted margin of uncertainty.
This especially needs to be cleaned up in the context of entropy, particularly if we're going to discuss the fluctuation theorem.
Also, the new measurements will therefore contain (a little) new information over and above the predicted distribution. So it's not quite true that S_I is unchanged. It will still be a constant, but strictly speaking it becomes a different constant, as we propagate the new information back, sharpening up our phase-space distribution for each earlier instant in time. (A toy numerical illustration is sketched below.)
-- Jheald 15:51, 1 November 2005 (UTC)
- -- DONE Jheald 22:07, 2 November 2005 (UTC)
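
On the fluctuation point above, a toy sketch of why measured values won't come in "bang on the nose" (my own illustration; the two-level distribution is invented for the example). Sampling microstates from a MaxEnt-predicted distribution, the measured mean deviates from the predicted expectation by a standard error of order 1/sqrt(N), so each new measurement generically carries a little information beyond the prediction:

 import numpy as np
 
 rng = np.random.default_rng(0)
 
 # Invented two-level system: MaxEnt-predicted distribution over E = 0, 1.
 E = np.array([0.0, 1.0])
 p = np.array([0.7, 0.3])
 predicted = np.sum(p * E)            # predicted expectation <E>
 
 for N in (100, 10_000, 1_000_000):
     sample = rng.choice(E, size=N, p=p)
     measured = sample.mean()
     # Fluctuates around the prediction with standard error ~ 1/sqrt(N).
     print(f"N={N:>9}: measured <E> = {measured:.5f}, "
           f"deviation = {measured - predicted:+.5f}")

Propagating such a deviation back through time (the update described above) is exactly what makes S_I a slightly different constant after each measurement.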

