prashant_n_mhatre

asked on

Meaning of ENTROPY

"Entropy is a measure of randomness and entropy of the universe increases."
I studied this a few years back without really understanding it. Whenever I try to explain this sentence, I cannot come up with examples.

Can someone explain this in plain English?




lexxwern

I'm a programmer, so this is my best explanation:



infinite universe loop()
{
    a = rand() % (entropy + 1);
    entropy++;
}

rand() % (entropy + 1) gives back to "a" a value between 0 and entropy (srand only seeds the random generator; rand is the function that actually returns a value),

entropy++ is the basic incrementing of entropy.
Enabbar Ocap
The famous example is dropping a tea-cup.
Initially the molecules in the clay are ordered in the shape of a cup. When you drop it and it breaks, the molecules are in a more random order. If you pick up all the pieces and shake them, you can be fairly certain that they will not re-form into the shape of a cup.

It's got something to do with the direction of time, you can tell which way time is moving by the increase in entropy.
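The shaken-pieces picture can be made quantitative with a toy calculation (a sketch, not real physics: it assumes every arrangement of the n pieces is equally likely and exactly one of them is "the cup"):

```python
import math

def prob_reassembly(pieces):
    """Chance that one random arrangement of `pieces` fragments
    is exactly the single ordered arrangement (the intact cup)."""
    return 1 / math.factorial(pieces)

print(prob_reassembly(3))   # 1 in 6: tiny systems can re-order by luck
print(prob_reassembly(20))  # ~4e-19: effectively never
```

A real cup has on the order of 10^23 molecules rather than 20 pieces, which is why "never" is a safe bet.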
Avatar of riccohb
riccohb

The teacup example is always a good one. There is a law of physics which says that entropy always increases. What this means in practice is that, left to its own devices, everything decays.

It's also related to the total amount of energy in the universe. The total amount of energy can never decrease (where would it go?). The thing is that objects with low entropy (ie they are highly 'organised', like an unbroken teacup) have more energy than those which aren't (the broken teacup) because it takes energy to keep them that way.

My desk is really untidy, with things strewn at random on it, so it's got a very high entropy. If I were to tidy it up I would be expending energy by doing so, so the total amount of energy in the universe would decrease. However, I would be decreasing the entropy of the desk by tidying it up, and if you measured the energy in the desk (held because of its low entropy) you would find that it exactly cancels out the energy I've spent in tidying it, so the total energy of the universe has actually stayed the same.

The fact that entropy always increases with time means that an unbroken teacup will smash when dropped (its entropy will increase) but a smashed one will not reform itself when dropped (which would mean a decrease in entropy).

Erm, I seem to have strayed off the point a bit, but I hope that helps.
learning
ASKER CERTIFIED SOLUTION
acerola
From memory there are certain circumstances under which entropy can actually decrease. As far as I remember this result comes from (what I was taught as) the original derivation of entropy. This examined the case of a chamber with a partition. One section had a gas and the other section did not. As time goes on we'd expect the gas to cover the whole chamber. However, due to probabilities and given an infinite amount of time, it is possible that the gas may realign itself into its original configuration. This is so incredibly unlikely that realistically it will never happen - but given an infinite amount of time, it WILL.

I appreciate that we're dealing with something incredibly unlikely, particularly when we deal with larger systems, but it's still an important point. If people dispute my recollection of things, I'll try to find the book that taught this fact. I have a feeling Roger Penrose's "The Emperor's New Mind" tackles the subject quite well...
Here's a link that confirms my suspicions:

http://scienceworld.wolfram.com/biography/Boltzmann.html
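The partitioned-chamber argument can be sketched numerically. Assuming each molecule is independently equally likely to be in either half once the partition is removed, the chance of catching all N of them back in the original half at a given instant is (1/2)^N:

```python
def prob_all_left(n_molecules):
    """Probability that all n molecules are found back in the
    left half of the chamber at one given instant."""
    return 0.5 ** n_molecules

print(prob_all_left(10))   # ~0.001: quite plausible for a handful of molecules
print(prob_all_left(100))  # ~8e-31: "never" on any human timescale
```

A real gas has on the order of 10^23 molecules, so the exponent is astronomical; that is the sense in which the recurrence "will" happen only given literally infinite time.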

What is the difference between an event that never happens and an event that takes forever to happen?

This reminds me of a discussion about whether it is possible for one particular person to win the lottery. On one hand, some people have already won, so it is possible. But it is so unlikely (1 chance in millions) that if this were a physical event it would be considered an impossible event.

Sometimes in physics you consider impossible an event that has an extremely low probability of happening. If you have an electron in Canada, it is "possible" to measure its position as being in Japan, but the chance is so extremely low that you can say that it is impossible for an electron in Canada to be measured in Japan.
You've precisely countered your argument. While "a" person may win the lottery, an individual may not. We don't categorically know how big the universe is. We don't know if there is a "multiverse". So considering such events is theoretically productive. There was a theory that our viewable universe was in fact one of these aberrations. Of course, it can never be proven (which means it can't be called a theory?).

I'm not trying to be annoyingly pedantic - rather just looking for completeness. You can't do any statistics unless your probabilities add up to 1. The normal curve doesn't exist without those really unlikely probabilities. Nor does integration. The only reason I mentioned this little element was because it's slightly counter intuitive and often left out. It only makes up a tiny part of what entropy is all about!
"You've precisely countered your argument. While "a" person may win the lottery, an individual may not."

Let's say that the lottery has 50 million possible combinations of numbers. If 100 million people play it, there is a high probability that one or more people win it. But the probability that you win it is very, very low.

The probability of ONE specific person winning the lottery is so low that it can be considered impossible. But the probability of ANY non-specific person winning the lottery can be very close to one.

Making a calculus analogy, the probability of one person winning is the differential. The probability of any person winning is an integral. A differential has a value so small that alone it can be considered zero, but when you integrate the differentials, it may have a big value.
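The lottery point can be put in numbers (a sketch with made-up round figures: a 1-in-50-million game and 100 million independent tickets):

```python
p_single = 1 / 50_000_000               # chance one specific ticket wins
n_players = 100_000_000

p_nobody = (1 - p_single) ** n_players  # chance that no ticket wins at all
p_anyone = 1 - p_nobody                 # chance that *someone* wins

print(p_single)  # 2e-08: "impossible" for any one named individual
print(p_anyone)  # ~0.86: near-certain that some non-specific person wins
```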
Interesting recent development about entropy...

http://www.newscientist.com/news/news.jsp?id=ns99992572
The laws of thermodynamics:

1st:  You cannot win, you can only break even.

2nd:  You can only break even at absolute zero.

3rd:  You cannot reach absolute zero.
An easy way to understand entropy is that it is the tendency of energy to distribute itself amongst all the available states. It has a very real effect, and explains why some substances explode.
 
An explosion is when a substance has a choice about rearranging itself, and the choices that involve greater numbers of fragments are preferred, because then kinetic energy can occupy more possibilities.

This is such a powerful tendency that the desire of energy to occupy more states will even drive some chemical reactions to absorb heat instead of emitting it, and as you mix the chemicals in a beaker it can actually begin to freeze.  

So to predict the direction of a chemical change, chemists have to not just look at the energy change, but also the different ways that energy is able to distribute itself.
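The "more fragments, more possibilities" idea can be counted directly. As a sketch (assuming energy comes in discrete quanta and every carrier is equivalent), the number of ways to spread q quanta over n carriers is the stars-and-bars count C(q+n-1, q):

```python
from math import comb

def multiplicity(quanta, fragments):
    """Number of distinct ways to distribute `quanta` units of
    energy among `fragments` carriers (stars-and-bars count)."""
    return comb(quanta + fragments - 1, quanta)

q = 50  # same total energy in both cases
print(multiplicity(q, 2))   # 51 ways when the energy has only 2 hosts
print(multiplicity(q, 10))  # ~1.3e10 ways with 10 hosts
```

Same energy, vastly more options with more fragments: that is the statistical pull behind the explosion described above.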
Incidentally, as far as the gas example above goes, it is possible for the gas to defy entropy, but "if you wait for a very long time" is better interpreted to mean "we can never know when this is going to occur with any certainty". I suspect that it would never happen because quantum theory would still operate: the uncertainty principle stops the gas from having a precise energy at a precise time.
While not taking time for the entire question or the review above,

entropy on a universal scale should refer to an expectation of a cooling off after the heated explosion,

or to an elasticity that has made some rubber band become less responsive over time and usage.

So IMO it is about mechanics involved on an extremely large scale,

and how a universe, or a god, is getting tired or bored.

If you want more, say so (reject proposal) and add some fire here.
I think you are all getting far too technical. Entropy can be considered as a measure of chaos or disorder; the higher the entropy in a given phenomenon (or situation), the less order there is (hence the broken teacup could be considered a 'more chaotic' version of an unbroken cup...)

Can't really go into too much detail cos I'm in a rush, but that's the way I have always considered it....
In plain English - energy can be measured - it has a quantity. But it also has a quality. That is, it also occupies states. Take a fluid. There are a couple of ways of looking at how the heat energy behaves in the fluid. You can either say that the molecules of the fluid possess an energy (related to their speed), or there is another way - you can imagine that the energy possesses the molecules! In other words the energy spreads itself around the molecules, so that the energy itself manages to occupy more states.
It's the same with chemistry. Without entropy, chemistry just wouldn't work at all. The basic state of chemistry is just equilibrium, no change.
There are two drivers for chemical change.  
One is the desire for energy to be lost from stuff and head out into space. Chemicals get less reactive over time by losing their stored energy. Batteries run down. Petrol burns.
The other is the desire for energy to split its bets at the casino of stuff.
If a substance can rearrange itself, say, four different ways, then the energy goes for the case that gives it the most freedom. Same energy, different configuration. So if there is a way of making 10 bits from one bit, that is where the energy goes. The energy is the same, it just likes to broaden its options. So what you do is build a molecule with lots of nitrogen bonded into it in different places, and then apply just a teeny bit of energy (a detonator) that gives the bits a chance to rearrange slightly. Then the desire of energy to hedge its bets takes over. It much prefers the arrangement where there are ten nitrogen molecules rather than one or two or three or four something elses.
But if you are suddenly a gas, and you find you have ten molecules moving when you had only one or two before, you will have ten times the pressure or more.  Oh well then you just have the thing we call the explosion.  Don't blame me, says the energy, I saw a chance to split up and inhabit more hosts so I just grabbed it.  
For a physical change instead, consider washing soda.  It has ten water molecules all bonded to it when you buy it in the packet.
Now when you put it in a jar of water, there is enough jiggling to encourage the bonded water to move a bit.  The energy in the free water rejoices, and it sees a chance to spread itself into more water states.
If only the bonded water would let go.  It needs energy to let go.  The free water gives up some of its heat energy just to let it be shared with the new water.  The bonded water breaks away to join its cousins.  The energy cost of this dance is incredible, and the water in the jar spontaneously chills.  It can even freeze.  Just so energy can distribute itself into more states.

Sometimes the discussion revolves around heat engines and such. It's the same idea: how the energy is carried. Doing useful work always involves focusing energy into narrower configurations, i.e. fewer states. Moving a piston is a total straitjacket for energy. Given any chance to escape those duties it will, so you never get 100% recovery.

Often you hear entropy being talked about as randomness, and the tendency to disorder etc. because you can calculate it without mentioning the energy that gives it its raison d'etre.

But I think you should give the points to lexxwern.
The very first post.
His definition is brilliantly simple.
prashant_n_mhatre (ASKER)

Thank you all for your help...I feel acerola's answer is very close to what I am expecting.
Thank you very much for the points.

Besides telling you the "arrow of time", entropy is used to calculate other stuff in physics. Usually, you take three steps in statistical physics to calculate real-life stuff:

1 - Standard theories: determine the model (classical or modern) that describes the phenomenon on a very small scale, such as: "what is happening with the individual atoms in a gas".

2 - Statistical physics: use statistical physics to calculate the entropy (or a relative of entropy such as the Helmholtz potential or the Gibbs free energy) for a very large system, with zillions of particles.

3 - Thermodynamics: use the entropy (or a relative) to calculate magnetization, pressure, temperature, volume, and other "real" stuff that can be directly measured.
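Those three steps can be sketched on the simplest toy model: N independent two-level atoms with an assumed level spacing eps (both numbers below are made up for illustration). Step 1 is the model, step 2 turns a microstate count into entropy via S = k*log(W), and step 3 extracts a measurable quantity (temperature) from how the entropy changes with energy:

```python
from math import comb, log

kB = 1.380649e-23  # Boltzmann constant, J/K

def entropy(N, n_excited):
    """Step 2: S = kB * ln(W), where W is the number of ways to
    pick which n of the N atoms are excited."""
    return kB * log(comb(N, n_excited))

# Step 3: temperature from 1/T = dS/dU. Adding one quantum changes
# U by eps and S by a finite difference:
eps = 1e-21   # J, hypothetical level spacing
N, n = 1000, 100
dS = entropy(N, n + 1) - entropy(N, n)
T = eps / dS
print(T)  # ~33 K for these made-up numbers
```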
Just as a matter of curiosity... any comments about the link that I provided about the second law of thermodynamics being broken? I know it doesn't give the actual paper - but it's interesting to think that we may not know as much as we think...
That is interesting but I suspect they forgot something.

BTW Entropy is measurable.  Any good book on chemistry has tables of standard entropies for chemical substances, and to a chemist entropy is very real.
 
Also I suspect this guy is doing a term paper :)
"Any good book on chemistry has tables of standard entropies for chemical substances"

Sure. They are all calculated. You don't have a machine/detector to measure entropy. No entropy-o-meter.

"and to a chemist entropy is very real."

Real as in being a physical quantity, yes. Real as in being measured directly, such as mass, length, time, etc., no. Depends on how you define "real". Semantics.

Chemists most often use the "Gibbs free energy", which is a different formulation of entropy.
gd2000,

The quantum world is a very weird one. Many classical laws are broken there. You can convert energy into mass, violating the classical laws of conservation. You can have energy/forces/particles coming out of the vacuum (Casimir effect). You have fluctuations in stationary states, uncertainty, random/unpredictable events, and much else.

While taking the undergraduate course on statistical mechanics I saw an experiment that made a magnetic system reach a temperature below absolute zero. Very weird. Another violation of thermodynamics.

Like all theories, thermodynamics has its limits. It can't be applied to everything all the time. This experiment doesn't prove that thermodynamics is wrong, it just draws the limits within which we should use it.

About the article: it was published in Physical Review Letters, which is considered to be the best journal in physics today. PRL has very good referees. It is very hard to have a paper accepted unless it is really good stuff.
>>it just draws the limits where we should use it.

Thermodynamics has to be one of the most encompassing theories I have ever come across, so any suspected chink in the armour is highly newsworthy. I am automatically sceptical because of the grandeur of thermodynamics and the possibility of overlooking the subtleties that would retain the law. On the other hand quantum theory is just as ubiquitous, so here we have an interesting mix. Normally thermodynamics is well respected by quantum theory.

What seems to happen most often in science is that the exception to a law grants an opportunity to extend or reformulate a theory, and in most cases the law or theory becomes even more encompassing, and often reveals the original law or theory or formulation to be a special limit case in a theory with a grander vision still.  

In that sense the original law is not defeated, or broken, or proven wrong, perhaps it is best to describe it as being "subsumed" into the new theory.

There are very few cases where a theory was eventually shown to be misguided in the first instance, and served to retard knowledge - the most infamous of these cases is the "Phlogiston Theory" that makes for a fascinating story.
The Gibbs equation is G=H-TS, where the energy component appears as "Enthalpy", and S is the entropy component of the free energy.  Gibbs was a total genius.  Entropy is measurable directly, and can be derived as a log function of probability.
Enthalpy is another transformation of entropy, as are the Gibbs free energy and the Helmholtz potential. It is also much used by chemists. The difference between them is the variables you use in the problem. If I am not mistaken, entropy uses temperature, volume and number of particles. The others may use pressure instead of volume, chemical potential instead of number of particles, and I don't remember what replaces temperature (entropy itself?). I have to check Callen (the best book on thermodynamics in my opinion).

"and can be derived as a log function of probability"

Sure it can. DERIVED, not MEASURED. That is the basis of statistical physics. Boltzmann postulated (and then killed himself because everybody told him he was crazy for proposing it):

S = k*log(W)

k is the Boltzmann constant. W is not actually a probability: it is the microcanonical ensemble, a count of all the possible microstates a system can have. S is entropy.

This is the connection between standard theories (classical mechanics, quantum theory, etc.) and thermodynamics. You use standard theories to calculate W and then use this formula from statistical mechanics to get the entropy. There are also similar equations for enthalpy (H), the Helmholtz potential (F), and the Gibbs energy (G).

check out: http://world.std.com/~mmcirvin/boltzmann.html

"Entropy is measurable directly"

HOW??? So far you have just shown ways of CALCULATING it. Have you ever used an entropy-o-meter? I haven't.
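One property of Boltzmann's S = k*log(W) can be checked in a few lines: because the microstate counts of independent systems multiply, the logarithm makes their entropies add, which is exactly what an extensive quantity must do (the W values below are arbitrary):

```python
from math import log

k = 1.380649e-23  # Boltzmann constant, J/K

def S(W):
    """Boltzmann's postulate: entropy from the microstate count."""
    return k * log(W)

W1, W2 = 1e10, 1e12          # microstate counts of two independent systems
combined = S(W1 * W2)        # counts multiply for the joint system...
separate = S(W1) + S(W2)     # ...so entropies add
print(abs(combined - separate))  # ~0: additivity holds
```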
So quickly - the distinction you are making is already well known in chemistry - it's the definition of two different kinds of variables, called "intensive" variables and "extensive" variables.

I cannot answer all your points just now, since it is my girlfriend's birthday, and she pretends not to be fond of birthdays, so I hope to change her mind. Or I will find myself with an expensive variable.

There is no intrinsic justification for challenging the status of an intensive variable as a model of reality, any more than there is for extensive variables.
The classic case is the extensive variable "heat" and the intensive variable "temperature".

The intensive variable "temperature" is measured fairly directly, with a therm-o-meter.
The extensive variable heat can be measured too, but it is just a lot more difficult and indirect: you have to build a calori-(o)-meter.

Entropy is not an intensive variable like temperature, so it is less easy to construct an "entrop-o-meter". In fact entropy is a classic case of an extensive variable: the instrument needs to be more complex, like a calorimeter, where you measure the change in other intensive variables between two states.
Such measurements are routinely made in chemistry, except that, just as we now have a vast knowledge of the thermal properties of materials, we also have a vast knowledge of the entropy states of most materials, and we can calculate changes accurately without needing to make such measurements in the lab.

Another point needing clearing up is that entropy and thermodynamic principles are consistent with quantum theory, and there are many such applications. For example the entropy change as a superconducting material undergoes a phase transition can be calculated and then measured in the lab to test theory.
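The calorimeter-style measurement described above can be sketched: along a heating path dS = C/T dT, so measuring heat capacity and integrating gives the entropy change. With a roughly constant C this is just C*ln(T2/T1). The water value below is an approximate handbook figure, used only for illustration:

```python
from math import log

def delta_S(C, T1, T2):
    """Entropy change on heating from T1 to T2 (kelvin) with a
    constant heat capacity C: integral of C/T dT = C*ln(T2/T1)."""
    return C * log(T2 / T1)

Cp_water = 75.3  # J/(K*mol), approximate molar heat capacity of liquid water
print(delta_S(Cp_water, 298.0, 348.0))  # ~11.7 J/(K*mol)
```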
"the distinction you are making is already well known in chemistry"

Chemists are evolving as they study more physics. I don't remember making that distinction, but since you mentioned it, here it goes:

extensive: volume, number of particles
intensive: pressure, chemical potential

I am pretty sure that temperature is intensive and its extensive partner is entropy, but I have to check on that.
I don't agree with your explanation about measurements. Volume is extensive and you can measure it. Pressure is intensive and you can measure it. I see no relation between being intensive or extensive and being able to measure something directly.
"it is less easy to construct an "entrop-o-meter""

Ah, so you agree that entropy has never been measured directly, just calculated. You said before:

"Entropy is measurable directly"
"Another point needing clearing up is that entropy and theromodynamic principles are consistent with quantum theory, and there are many such applications"

It is not easy to understand how you go from standard theories, such as classical mechanics or quantum theory, to thermodynamics, using statistical physics. The point is that when you apply statistical mechanics to quantum problems you SOMETIMES find violations of the classical thermodynamic postulates. Just another piece of evidence that the quantum world is not what our common sense might expect.

"For example the entropy change as a superconducting material undergoes a phase transition can be calculated and then measured in the lab to test theory"

Superconductivity is a great example. We have two kinds of superconducting materials. Type 1 has a very good theory (Cooper pairs), formulated by Bardeen, Cooper and Schrieffer. But type 2 is still unexplained. Type 2 superconductors are the ones with high critical temperatures (about 70 kelvin). The models of statistical physics and thermodynamics fail to describe them correctly, and many physicists today are researching a good explanation for this problem (some of them even apply quantum field theory to solid state physics).
All measurements are indirect in some sense, and that says nothing about the status of the property in question.

Entropy is perfectly real and significant in the physical world, it can flow across boundaries, and is just as real for systems that are not in equilibrium.

Entropy is not mysterious: it is a measure of the spread of energy across the available microstates, which is why it can be crudely equated with "disorder" in the popular imagination, and leads to all kinds of silly mumbo-jumbo about life being a "negentropic" force and other nonsense.

It explains perfectly the concept of "work", because to do useful work you always have to align real materials, to constrain them to act together in certain ways (for example the piston in your car). You are restricting energy's possibilities, and that requires a cost in the greater plan of things: there must be an overall greater freedom allowed to energy elsewhere.

The behaviour of substances such as superconductors is perfectly consistent with thermodynamics.  Quantum mechanics is just that, mechanics, and mechanical systems obey thermodynamics.  

Statistical mechanics on the other hand is not the same thing as thermodynamics. Statistical mechanics is the attempt to predict thermodynamic behaviour by building models incorporating various statistical principles. What you are referring to is the breakdown of those statistical postulates, not of the basic laws of thermodynamics. Of course quantum theory produces different statistical behaviour, and that allows us to rescue thermodynamics and show how real substances really do obey the laws of thermodynamics, and where the entropy really is. But by no means does quantum theory challenge thermodynamics itself.

I have in front of me a book called "SI Chemical Data" by Aylward and Findlay, and it tabulates some 100 pages of data on chemical substances, with tables of values for entropies.  For example sodium chloride (common salt) in the solid state has an entropy of 72 Joules/Kelvin/mol.  It also lists the value for common salt in the gaseous form, which is 230 Joules/Kelvin/mol.
The standard entropies of evaporation of liquids at their boiling points show a remarkable uniformity. "Trouton's Rule" puts it at about 88 Joules/Kelvin/mol. (The mol is the standard unit for amount of substance.)

Here's a great reference:

http://www.people.cornell.edu/pages/jag8/thermo.html#entropy

Points out that the "disorder" view is mainly baloney.
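Trouton's rule from the post above can be checked against a few rough handbook values (the heats of vaporization and boiling points below are approximate figures, so treat them as illustrative): the entropy of vaporization is just dH_vap / T_b.

```python
# (heat of vaporization in J/mol, boiling point in K) - approximate values
liquids = {
    "benzene":    (30_720, 353.2),
    "chloroform": (29_240, 334.3),
    "water":      (40_660, 373.2),  # an outlier: hydrogen bonding
}

for name, (dH_vap, Tb) in liquids.items():
    # Trouton's rule predicts roughly 88 J/K/mol for "normal" liquids
    print(name, round(dH_vap / Tb, 1), "J/K/mol")
```

Benzene and chloroform land right near Trouton's ~88 J/K/mol; water comes out near 109 because hydrogen bonding makes the liquid unusually ordered.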
It is also worth mentioning in this context that the area of a black hole is proportional to its entropy.
All you said seems correct, but you still didn't say how you measure entropy. You have only given different ways of calculating it.
"All measurements are indirect in some sense"

Not all, but many. When you measure length with a ruler, it is a direct measurement. You are comparing the length of the ruler directly with the length of the object. When you measure length with a laser, it is indirect, because you use the speed and the time to calculate the length.

Also, counting is a very direct measurement.

A scale measures weight directly (comparing directly the weight forces exerted by the two objects in the trays, or comparing the weight force with a spring's elastic force), but to know the mass you need to know the local value of gravity to do the calculation. Someone's weight is different on Earth and on the Moon, but the mass is the same.

A clock also performs a direct measurement of time. It compares directly the time its pendulum takes to make a full cycle with the time of the event you are measuring.
The entropies of the chemical substances I gave you were not calculated from probability theory; they were determined in the lab, and there is a corresponding experimental error attached to each, as you would expect.

There are any number of measures that represent the area under a graph, or the slope of a graph, it matters not.
If you like you can get the entropy as the rate of change of free energy with temperature; it is easily obtained from specific heat measurements, and depending on what variables you want to hold constant it can be obtained in numerous other ways.
 
Answer me this then, if you insist on attacking the status of entropy - all your arguments apply equally to HEAT.
Tell me, how do you measure heat? We DON'T! We control other variables and measure other things like volume, pressure and temperature. It is just as hypothetical, and just as real. There is absolutely no ontological difference.
"There are any number of measures that represent the area under a graph, or the slope of a graph, it matters not"

area under a graph = integral (calculation)
slope of a graph = derivative (calculation)

all indirectly measured (or calculated)

"Answer me this then if you insist on attacking the status of entropy"

I am not attacking it. I am just saying that it can't be measured directly, only calculated from other stuff that you measure directly. I am not saying it is not real.

"all your arguments apply equally to HEAT then"

Sure. Heat is energy. I can't think of an energy measurement made directly. They are calculated.

"We control other variables and measure other things like volume, presure and temperature"

Correct. And use them to calculate heat. And then use heat to calculate entropy. Entropy is always calculated. There is no machine to measure entropy directly.
Then there is no machine to measure temperature directly.
Now you are beginning to understand it.
There is no way to measure time either, then.
I am sure I have got it now.
So rulers for measuring length are much better than clocks for measuring time, especially digital ones because....
You measure time using length!
I better keep my old sundial then, lucky I didn't throw it out!
No wait, I can count the ticks on my grandfather clock, so that must be better than my digital watch.
I better start taking that to work.
So I must understand that counting scratches is a direct measurement of length. Hmmmmm. No. I will have to try harder. So I need to know that counting is not pure calculation. Hmmm. No, I just don't understand the difference between calculating and not calculating length.

But then, I always imagined that you could multiply and divide real things as well as numbers.

Boy was I silly.

I mean at school I learned to multiply numbers. 4*6=24
Then as I was growing up, one day I figured out that you could multiply things that were not numbers. Call me dumb, but I was under the strange delusion that you can multiply a real length by a real length and get something called an area. You mean there are people who have never understood that multiplication doesn't just work for numbers, it works for things too?

And call me mad, but I actually also thought that you can divide real things like sticks, just like you do numbers, and that you get a real number out as the answer, even though the inputs were sticks!

mystick / yourstick = anumber !!!!

Boy was I dumb!

I even imagined that the French had a special stick called the metre they used for that. Silly me. If I had only known that you didn't need to calculate at all!

Our ancestors got by without knowledge of metric 'system'. They got by just the same.
That would make a good question. Why on earth does the US cussedly stick to the clumsy old imperials.
The basic mistake acerola is making is that he is confusing making measurements quickly with making them directly. All the measure-o-meters do is make measurements quickly. The speedometer in your car does not measure speed directly, it measures speed quickly. It's an easy mistake to make without thinking about it, because we like to think that data that gets to our senses quicker is more "direct". But it's just an illusion. Just like computers, it's all smoke and mirrors. It takes time and care and patience to measure heat, whereas we can get temperature change more quickly from voltages across metals or thermal expansions, i.e. from the various theoretical effects of the property. It's the same with entropy: we can measure it from other properties. The only reason we don't build an entrop-o-meter is not because we can't, it's just that we don't want to.
And the reasons for that are human and to do with how we apply technology. Be warned! As the petroleum runs out we may yet see the Ministry of Entropy coming round with their entropometers to scan your place for wasteful energy consumption practices!
And of course there are energy measuring meters, otherwise how would the power utility know how to charge you on your power bill?
The role of entropy as an agent of change, and the dream of energy for nothing, sometimes leads to proposals for perpetual motion machines that implicate entropy without considering it explicitly:

https://www.experts-exchange.com/questions/20317067/Electrolysis-at-pressure.html
OK, let me define what a direct measurement is. A direct measurement is performed when you compare a physical quantity with the SAME physical quantity of a reference.

When you use a ruler to measure length, you are comparing the length of the ruler with the length of the object. Or, you can use your foot. You compare the length of your foot with the length of the object being measured. You then say that the object is 3.5 feet, which means that its length equals 3.5 times the length of your foot. You are comparing length with length.

Now time. A grandfather clock has a constant time of oscillation of the pendulum. You compare the number of ticks with the time of the event you are measuring. So if you counted 11 ticks, you are comparing the time of the event with the time of the tick.

I don't know exactly how a digital clock works, but I know some have a quartz crystal in them that "ticks" electronically. With those you are comparing the time the circuit takes to make a loop with the time of the event.

Weight can also be measured directly with a scale. If you put a melon in one tray and, let's say, 5 apples in the other tray, you can say that the melon weighs 5 apples, or an apple weighs 1/5 of a melon. The unit doesn't matter. It can be kilograms, pounds or whatever. The measurement is direct because you compare weight with weight.

Unfortunately, most physical quantities can't be compared directly. Or it is very hard to do so. So we make an indirect comparison, an indirect measurement.

Temperature, as stated above, is usually measured indirectly. A thermometer has a fluid inside that changes its volume as it heats. You are using the volume of the fluid to measure the temperature of the object. You must know how the fluid behaves in order to make the calculations and obtain the temperature. You are comparing volume with temperature. To do so you must do some calculation involving the physical properties of the fluid.

Electrical energy meters behave much like current and voltage meters. They use electro-magnets to make the measurement. They don't compare the energy flowing through them with some other energy. I don't know about digital meters, but analog meters use a magnetically induced force to measure voltage and current indirectly. It is like a magnetic scale. You put electricity in it and the pointer moves due to the magnetic force. It is indirect because you are using the magnetic force to measure current, voltage and energy (energy = voltage * current * time).
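To make the indirect relation concrete, here is a minimal sketch of the energy = voltage * current * time calculation the meter performs behind the scenes. The numbers are made-up example values, not from the thread:

```python
def electrical_energy_joules(voltage_v: float, current_a: float, time_s: float) -> float:
    """Energy dissipated by a constant load: energy = voltage * current * time.

    The meter never compares energy with energy; it senses voltage and
    current (magnetically, in an analog meter) and the energy is calculated.
    """
    return voltage_v * current_a * time_s

# Example: a 230 V appliance drawing 2 A for one hour (3600 s).
energy = electrical_energy_joules(230.0, 2.0, 3600.0)
print(energy)                # 1656000.0 joules
print(energy / 3.6e6)        # in kWh (1 kWh = 3.6e6 J), roughly 0.46
```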

Electrical current is the amount of electrons that flow through something in a given time. It is possible to count them, but it is not very easy. So we attach a device to the circuit that uses the same magnetic scale to make the measurement. We are comparing the number of electrical charges with a magnetic force.

Ok, now about measurements of energy and entropy.

Objects don't have intrinsic energy in them. You can only measure change in energy. You can't say that an apple has 100 kJ of heat. You can only say that for an apple's temperature to rise by x degrees it needs y joules of heat energy. The only thing that comes close to intrinsic energy is maybe the famous E=mc2 equation, which would give an "intrinsic" energy that would be liberated in a matter/anti-matter annihilation.

Same for entropy. You can't say that an object has x entropy. You can only measure (calculate) a change of entropy.

But, just because some quantity can't be measured directly doesn't mean that it is less real. It just means it is a bit harder to understand. I myself find the concepts of temperature and energy difficult, and entropy is by far one of the hardest.
"So rulers for measuring length are much better than clocks for measuring time"

A direct measurement is not necessarily a better measurement. You can use a laser to measure length indirectly and get a better result than with a ruler.
"You measure time using length!"

I would say "you can measure time using length". That is pretty much what an analog watch does. It moves at a constant speed, so if you measure the length, you know the time.

If you use a candle to measure time, the flame burns down at a constant speed. You measure the length of the candle and you know the time. That is how people measured time before there were clocks.

But you can also measure time directly. You just have to compare a reference time with the event you are measuring. For example, ticks of the clock.
"No wait, I can count the ticks on my grandfather clock, so that must be better than my digital watch"

Not really. If you had an electronic circuit to do the counting for you, wouldn't that be better?
"So I must understand that counting scratches is a direct measurement of length"

Yes.

"So I need to know that counting is not pure calculation.  Hmmm.  No, I just don't understand the difference between calculating and not calculating length."

Ok, here we go to the semantics again. ;)

Yes, counting is a kind of calculation. But the calculations I was referring to are those that involve different physical quantities. Like in the thermometer: you count the scratches, which gives you the volume of the fluid, and then you calculate the temperature using the physical properties of the fluid (I don't know the name of the property in English; maybe it's the dilatation coefficient?).

And you must agree with me that people don't usually use the word calculation for counting.
"I always imagined that you could multiply and divide real things as well as numbers."

Sure you can. You can say that a melon weighs five and a half apples. Or that a grape weighs one tenth of an apple. Etc. Numbers exist to represent real things.
"I even imagined that the French has a special stick called the meter they used for that"

"Our ancestors got by without knowledge of metric 'system'. They got by just the same"

The unit doesn't matter. You just have to have a constant reference. Some people use a meter that has 10 divisions. Others use the foot, which has 12 divisions. You could use kilometers, miles, light-years, parsecs, whatever. The unit doesn't matter. I even used the unit "apple" for weight in my examples.

Units have nothing to do with direct or indirect measurement.
"The basic mistake acerola is making is that he is confusing making measurements quickly, with making them directly"

I don't remember saying anything about how long you take to make the measurement.

"The speedometer in your car does not measure speed directly"

You are correct. Where did I say that speed was directly measured? It probably could be, but it would be hard. I myself don't know how.

"The only rason we don't build an entrop-o-meter is not because we can't, its just that we don't want to"

My question remains unanswered. HOW DO WE BUILD AN ENTROPY-O-METER??? You don't even have to build one; just tell me how you can compare entropy with entropy directly.
"Then there is no machine to measure temperature directly"

"The speedometer in your car does not measure speed directly"

I agree with you. Oh, wait. NO! We can't agree on something. Sorry, now you must change your point of view.
Please excuse the intrusion but, On the subject of measuring time, I read this example once:

Imagine yourself sitting on the bank of a small river. On the bank are two markers. You can measure the speed that the water is flowing by throwing floating objects into the stream and timing how long they take to travel between the two markers. The clock you are using is conveniently situated on the side of a building on the opposite bank. After several measurements, you cross the river to find that the clock mechanism is driven by a water wheel that is powered by the river.

The point is that we don't actually know how fast time goes. It may vary and we have no way of telling; we can't step outside of it, and any calculation we perform involving time (speed of river, gallons per second etc.) is affected by the speed of the river/speed of time. A digital watch, the speed of light or the swing of a pendulum may all vary. We treat these as constant, but as they are all affected by the thing they are measuring we can never be sure.

Can we still use our perception of time when measuring entropy?

Of course there is a unit for entropy.  I already gave it to you.  Its Joules per Kelvin.  

Of course you can define an entropy scale. Quite commonly absolute zero temperature is taken as absolute zero entropy. Are you still using Fahrenheit in the US imperial system? I cannot remember how many degrees below zero that is in F; it's so long since I used any imperial units.

Of course it can be measured.  Why do you think we are warm blooded?  

The food you eat gives up energy when we digest it.  But we don't interact with its entropy, it just gets a heap more as we smash it up in our stomachs, but we expel that so forget the entropy of food.
 
We just want the energy.  Think about it!  When we use energy from food we convert energy to motion, so we should get COLDER not hotter.  So why do we get hot?  Because we need to maintain low entropy structures.  The only way we can do that is to export entropy as heat!  
You want an entropy meter?  You are living with one!


Robin D
I will start a new question on these things.
See you there

:7)
"Of course there is a unit for entropy.  I already gave it to you.  Its Joules per Kelvin."

Where did I say there wasn't? There are as many units as you like. It is energy/temperature. Any unit of energy divided by any unit of temperature.

"Of course you can define an entropy scale"

Sure you can.

"Quite commonly absolute zero temp is absolute zero entropy."

Wrong. There is no intrinsic entropy. As there is no intrinsic heat.

"Of course it can be measured."

ONCE AGAIN: HOW DO YOU MEASURE ENTROPY??? I am getting tired of asking this.

"Why do you think we are warm blooded?"

You tell me. I don't know what entropy has to do with us being warm blooded. I thought it was evolution that made us this way.

"When we use energy from food we convert energy to motion, so we should get COLDER not hotter."

The energy conversion is not 100% efficient. Some of the food's energy is converted to movement and some of it is lost as heat. Just like when you put electricity through a copper wire: it heats up a little and you lose energy. Please, come back to the point.

Plus, alligators, snakes, frogs, fish: all of them eat and convert energy into movement. Why aren't they warm blooded? Your explanation is not very good.

"You want an entropy meter?  You are living with one!"

Am I? Where is the dial?

So far you haven't presented an entropy-o-meter. Not even an indirect one. You keep changing the subject.
RobinD, your point is very good.

All measurements are comparisons. When we measure time we compare the time of something with the time of something else. And yes, at some point we must assume a constant reference to perform the measurements.

In our case (humans) we assumed that the time Earth takes to make a full rotation is constant. We called it a day. We divided it into 24 parts, which were divided into 60 and then again into 60, giving us the second, which is the unit of time in the International System.

A simple pendulum performs a constant number of cycles in a day. So we conclude that if the day is constant, so is the pendulum, and the tick on the clock, and the circuit in the watch, etc.

The unit for length (the meter) was just an arbitrary value. But after special relativity it was redefined slightly, so that the speed of light in a vacuum (which is constant) would have an exact defined value.

Is the speed of light really constant? Is the day really constant? We must be careful not to get philosophical when asking these questions.

Science is based on postulates. At some point we must assume that something is true with no explanation other than experimental observation.
Ha!

What rubbish!

I can build any number of entropy meters in five minutes.

Entropy is so pervasive, its a piece of cake!

The simplest meter would be anything that measures volume!

ΔS = n R ln (Vf / Vi)

Vf = final volume
Vi = initial volume
ΔS = entropy change (isothermal expansion of an ideal gas)
n = number of moles
R = the gas constant

You obviously don't understand that as simple a thing as doubling a volume results in twice as many states!
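As a quick sketch of the formula above (assuming one mole of an ideal gas expanding isothermally; the numbers are illustrative, not from the thread):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def entropy_change(v_initial: float, v_final: float, moles: float = 1.0) -> float:
    """Entropy change for isothermal expansion of an ideal gas:
    dS = n * R * ln(Vf / Vi)."""
    return moles * R * math.log(v_final / v_initial)

# Doubling the volume of one mole of gas:
dS = entropy_change(1.0, 2.0)
print(round(dS, 3))  # 5.763 J/K, i.e. R * ln(2)
```

Doubling the volume gives ΔS = R ln 2 per mole, which is the "twice as many states per molecule" claim expressed as a number.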





Fifty cents' worth of signal conditioning circuitry and I can tell you exactly what the mixing tendency of two chemical concentrations will be: ln(A2/A1). Hardly rocket science. ANYTHING that measures the concentration of a chemical can be quickly turned into an entropy meter.
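To illustrate how two concentration readings map to an entropy change, here is a hedged sketch. It assumes an ideal dilute solution, where the volume available to the solute scales as 1/concentration, so the same logarithmic formula applies:

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def dilution_entropy(c_initial: float, c_final: float, moles: float = 1.0) -> float:
    """Entropy change of a solute diluted from c_initial to c_final.

    For an ideal dilute solution, volume scales as 1/concentration,
    so dS = n * R * ln(c_initial / c_final).
    """
    return moles * R * math.log(c_initial / c_final)

# Halving the concentration doubles the effective volume:
print(round(dilution_entropy(1.0, 0.5), 3))  # 5.763 J/K, same R * ln(2) as before
```

In this sense any concentration sensor plus a logarithm is an (indirect) entropy meter.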
The entropy zero at absolute zero DEFINES the temperature scale. A scale is just that, a scale.  But you can dig into the nucleus of an atom and define a nonthermal entropy.  Why not.  That doesn't invalidate other measures.
Heat from food is nothing like passing current through a wire; it is a quite deliberate biological STRATEGY, not an accident. First let's get muscular activity out of the way: that produces heat from the ATP reaction (not friction or resistance, of course).

So what is left? You may not realise it, but up to 70% of the resting energy requirement of humans must come from fats. Most energy metabolism takes place inside the matrix of the cell mitochondria. What actually happens is that electro-chemical reactions use proton pumping to set up a massive electrical potential of 150 mV across this tiny membrane, which means the proteins have to operate under an electrical stress of 30,000,000 volts/meter. This electrical gradient acts as a chemical factory: it is able to reverse unfavourable reactions and thereby synthesise ATP for muscular activity. But note this:
- the mitochondrial uncoupling proteins UCP1, UCP2 and UCP3 divert some of the energy into heat DELIBERATELY.

This can be controlled by hormones and sensors elsewhere in the body.  The higher temperature of warmblooded animals is necessary so that we can export entropy more effectively.  We need to remove excess entropy so we can maintain our internal structures in lower entropy states.
This is particularly true of big-brained animals, where structural maintenance is the big issue.

Your brain is not building structure for free.  Our sensory neurons are calibrated to regulate our entropy export capability.  The heat can be exported by various means, but the temperature is the most problematical part of entropy - so the sensors make you shiver, or sweat.  

The point is, you ARE an entropy measuring/responding device.