In E2, 130815, Thursday AM, they made extensive use of analogies between energy and money when dealing with loss/waste vs. usefulness. The situation is different if four people have $100 each vs. one million people have one dollar each, with examples of what happens when the rich and the poor get to interact.
Leslie came up with this idea: "It's not how lucky I am, it's where I live." In other words, what you can do with a certain amount and form of energy depends on the energy in the surroundings.
Sture Nordholm has thought more about how to map thermodynamics onto money in education:
http://pubs.acs.org/doi/pdf/10.1021/ed074p273.
I like this analogy by Peter Atkins http://www.google.com/books?hl=sv&lr=&id=kJyXzvkXWBAC&oi=fnd&pg=PT2&dq=atkins+galileo%27s+finger&ots=Z_5l1WXMZZ&sig=Ukrtm9TUcfrbpde8uY7YbgGEeKo:
"The analogy I like to use to show the connection [between the interpretations of entropy in macroscopic thermodynamics and statistical mechanics] is that of sneezing in a busy street or in a quiet library. A sneeze is like a disorderly input of energy, very much like energy transferred as heat. It should be easy to accept that the bigger the sneeze, the greater the disorder introduced in the street or in the library. That is the fundamental reason why the ‘energy supplied as heat’ appears in the numerator of Clausius’s expression, for the greater the energy supplied as heat, the greater the increase in disorder and therefore the greater the increase in entropy. The presence of the temperature in the denominator fits with this analogy too, with its implication that for a given supply of heat, the entropy increases more if the temperature is low than if it is high. A cool object, in which there is little thermal motion, corresponds to a quiet library. A sudden sneeze will introduce a lot of disturbance, corresponding to a big rise in entropy. A hot object, in which there is a lot of thermal motion already present, corresponds to a busy street. Now a sneeze of the same size as in the library has relatively little effect, and the increase in entropy is small."