Submitted by Zalack t3_11x4f9t in askscience
Chemomechanics t1_jd1ton2 wrote
>Plasma seems intuitive because you are stripping pieces of the atom away, but what about the three basic phases?
Whether a simple material is a solid, liquid, or gas at equilibrium depends on which phase has the lowest Gibbs free energy at that temperature, pressure, and other conditions.
Nature prefers both strong bonding and high entropy, and the Gibbs free energy incorporates both as a tradeoff: it's the enthalpy minus the temperature multiplied by the entropy. This is why the higher-entropy phase always wins at higher temperatures: solid to liquid to gas.
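In symbols, with H the enthalpy, T the absolute temperature, and S the entropy:

G = H - T*S

The -T*S term grows in magnitude as T rises, so entropy increasingly dominates the tradeoff.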
Thermodynamic entropy in this context is an ensemble property that isn't well defined for a single atom, so it doesn't make sense to talk about a single atom having a certain equilibrium phase.
LoyalSol t1_jd417r3 wrote
One small nitpick: entropy is still very well defined even at the atomic level. A lot of excitation phenomena are dictated by entropy.
There are many different types of entropy, but they all relate to the same underlying concept.
It's one of the few bulk properties that has a near one-to-one correspondence with its microscale counterpart.
PercussiveRussel t1_jd4hspw wrote
Can you point me to specific books or papers (or even terms) that clarify this further? From my thermodynamics and stat-phys (and I guess solid-state) knowledge, I would definitely call entropy an ensemble property (I'd call it *the* ensemble property).
I'd guess that you could be talking about mixed-state density matrices, but even that would involve multiple objects, no?
LoyalSol t1_jd4r58z wrote
So, to be a bit careful about how we define things: yes, entropy is still tied to an ensemble in that it is directly related to the probability of an observation, and probability is of course built up from many observations. But the key is that entropy can be observed in any type of probabilistic system, and it will very often behave the same way in a system of millions of atoms as in a system of a single particle. It will just be tied to different averages, such as the time average, spatial average, etc.
Where entropy is distinguished from many other bulk properties is that the latter are often the result of thousands of atoms acting in unison, whereas entropy can be observed even in a single-particle system. That's especially true when talking about quantum descriptions of molecules.
For a single particle, the Jacobian of the principal coordinate is the entropy term.
Say, for example, you have a classical particle that is attracted to a single point by the potential
E(r) = 1/2 * k * (r-r0)^2
In this system we can write the Jacobian as a function of r; for an N-dimensional system, after integrating out the angular terms,
J(r) = r^(N-1)
If you perform a simulation of the particle with a given momentum, one thing you'll notice is that while the lowest-energy position sits at a distance r0 from the center, the time-averaged distance will only equal r0 if we run the simulation in one dimension. With two dimensions you'll see a value somewhat above r0, and as we add more and more dimensions the particle deviates further and further outward from r0. That's because increasing the number of accessible dimensions increases the translational entropy: a hyper-dimensional particle spends very little time near r0 despite r0 being the most stable position.
You don't need multiple equivalent systems to observe this. The time average of a single particle will give rise to this.
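If it helps make that concrete, here's a quick numerical sketch (the constants k, r0, and 1/kT are arbitrary choices of mine): it averages r against the Boltzmann weight times the Jacobian r^(N-1), standing in for the time average of a long run (canonical weighting rather than a literal trajectory).

```python
# Illustrative sketch: mean radial distance <r> of a classical particle in an
# isotropic harmonic well E(r) = 1/2 * k * (r - r0)^2, weighted by the
# N-dimensional radial Jacobian r^(N-1). All constants are arbitrary.
import numpy as np

k, r0, beta = 5.0, 3.0, 1.0            # spring constant, well minimum, 1/kT
r = np.linspace(1e-6, 30.0, 300_000)   # uniform radial grid, far past the well

def mean_r(n_dim):
    weight = r**(n_dim - 1) * np.exp(-beta * 0.5 * k * (r - r0)**2)
    return (r * weight).sum() / weight.sum()   # uniform grid, so spacing cancels

for n in (1, 2, 3, 10, 50):
    print(f"N = {n:2d}: <r> = {mean_r(n):.3f}")  # drifts outward from r0 as N grows
```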
In statistical mechanics we usually define these things in terms of a number of equivalent systems because, in practice, that's what we're typically measuring, and we take advantage of the ergodic hypothesis to link the time average to the other averages of interest. But entropic effects show up even in atomic and sub-atomic systems, and many behaviors are a direct result of them. For example, if an electron can be excited into a set of higher orbitals that all have the same energy, the sub-orbital with more angular-momentum states will be preferred simply because it has more combinations available.
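As a toy version of that orbital example (the numbers are illustrative, not real atomic data): with two levels at the same energy, the populations end up proportional to the number of states in each.

```python
# Toy sketch: two levels with equal energy but different degeneracies.
# Populations follow p_i ∝ g_i * exp(-beta * E_i), so with equal energies
# the degeneracy ratio decides everything. Values are illustrative only.
import numpy as np

g = np.array([1.0, 3.0])   # states per level, e.g. 1 vs 3 sub-orbitals
E = np.array([1.0, 1.0])   # identical energies
beta = 1.0                 # 1/kT, arbitrary

w = g * np.exp(-beta * E)
print(w / w.sum())         # -> [0.25 0.75]: the 3-state level is filled 3x as often
```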
Larger systems have more forms of entropy they can take advantage of, such as swap entropy, rotational entropy, etc., but the rules and interpretations are very much the same whether you have a million particles or just one. That's not always the case for other bulk properties; sometimes those are only observable in the limit of the average, not on a single particle.
PercussiveRussel t1_jd50td1 wrote
Ah yes, this helps a lot. Brings back a lot of statphys memories too. Thank you very much.
In a way, a time averaged system could be described as a mixed-state density matrix I suppose, which is where my intuition comes back again. I always picture a single object as being in a pure state, but there are ways it doesn't have to be.
Because when you say that entropy is tied to the probability of an observation, that really doesn't hold for an object in a superposition, since its multiplicity of states is just 1 (the superposition itself), which is where we do need to be careful I guess. I'd call it classical probabilistic, and avoid all confusion with quantum probabilistic.
So, to get more philosophical: It feels like there needs to be some sort of "outside influence" on a single particle for it to have entropy. Would you agree with this line of thinking? For some definition of outside influence.
That is not me trying to say my intuition was right by the way, it wasn't.
LoyalSol t1_jd586m6 wrote
>Because when you say that entropy is tied to the probability of an observation, that really doesn't hold for an object in a superposition, since its multiplicity of states is just 1 (the superposition itself), which is where we do need to be careful I guess. I'd call it classical probabilistic, and avoid all confusion with quantum probabilistic.
It gets a little strange in quantum, but you still have entropy effects there. But yeah, it gets kind of hairy just because superpositions themselves are already strange to begin with.
It's been a while since I focused on quantum stuff so I won't go too much into those since I'll probably get myself into trouble. :)
>So, to get more philosophical: It feels like there needs to be some sort of "outside influence" on a single particle for it to have entropy. Would you agree with this line of thinking? For some definition of outside influence.
It's easier to understand with an outside influence, but even for, say, a classical particle in a box where all points are equally probable, the more dimensions you have, the less likely you are to observe the particle near the center of the box, simply because a hypercube has more volume toward its edges than at its center, and that effect grows with dimension.
I guess we could say the box is an outside influence, but then could we even have a system without any constraints whatsoever? I would have to think about that.
For an isolated particle, the volume of the space it occupies is where it gets its entropy from. Even for a quantum particle in a box the trend holds, just not uniformly, since you have a wave function. The odds of observing the particle near the center of the box go to 0 as the number of dimensions increases; you're more likely to find it near the edges in higher dimensions.
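A quick Monte Carlo sketch of that classical picture (box size and the "near the center" cutoff are arbitrary choices of mine): the central half-width region's share of the hypercube's volume is exactly (1/2)^N, so a uniformly placed particle is almost never seen there in high dimensions.

```python
# Sketch: a particle placed uniformly at random in the hypercube [-1, 1]^N.
# "Near the center" means every coordinate lies within half the box's
# half-width, a region holding exactly (1/2)^N of the total volume.
import numpy as np

rng = np.random.default_rng(0)
for n in (1, 2, 3, 10, 20):
    pts = rng.uniform(-1.0, 1.0, size=(100_000, n))
    frac = np.mean(np.all(np.abs(pts) < 0.5, axis=1))
    print(f"N = {n:2d}: P(near center) = {frac:.4f}  (exact: {0.5**n:.4f})")
```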
That growing share of volume away from the center, as a bit of trivia, is why the translational term is usually the only one in the statistical-mechanics partition function with a volume component: the other forms of entropy deal with internal degrees of freedom, whereas translational entropy comes from the space the system occupies.
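For reference, the textbook ideal-gas form of that translational term (standard stat mech, not something derived in this thread) is

q_trans = V / Λ^3, where Λ = h / sqrt(2 * π * m * kB * T)

and that V is the only place the system's volume enters the usual partition-function factors.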
Chemomechanics t1_jd4pbix wrote
> One small nitpick: entropy is still very well defined even at the atomic level. There are many different types of entropy, but they all relate to the same underlying concept.
Isn't it clear from the context that I'm referring to the thermodynamic entropy as applied to ensembles of molecules to determine the equilibrium bulk state?
LoyalSol t1_jd4w78f wrote
Yes, but even that is still in one-to-one correspondence with the partition function, which counts the accessible states.
The thermodynamic entropy is actually well defined at the atomic level, whereas many other properties only exist in the bulk limit.
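For reference, the standard statement of that correspondence (textbook stat mech, not something specific to this thread) is Boltzmann's

S = kB * ln(W)

with W the number of accessible microstates, or more generally the Gibbs form S = -kB * Σ_i p_i * ln(p_i), which reduces to Boltzmann's when all W states are equally probable.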
Chemomechanics t1_jd5cot7 wrote
Sorry, I don’t see how this helps the OP. It sounds like you’re talking about looking at the behavior and any transitions over a very long time rather than relying on the ergodic hypothesis and stat mech assumptions based on large N. OK, so now you’ve calculated what you consider the entropy. I don’t get how this allows the OP to classify the atom as a bulk solid, liquid, or gas when it’s a lone aqueous atom.
LoyalSol t1_jd5zils wrote
I made the point about time to show that you can demonstrate entropy exists in the absence of large N. It is not exclusive to that, however.
I was not making a point about the OP's question so much as clearing up the statement that entropy is a strictly bulk property, when it isn't. Unlike many thermodynamic definitions, entropy is defined the same way on both the macro and the micro scale.
Thus why I called it a nitpick.