Submitted by Zalack t3_11x4f9t in askscience

From a totally naive point of view it seems like whether matter is a solid, liquid or gas largely has to do with how those atoms behave as a group.

If you have a single atom of uranium suspended in water at the right pressure and temperature for it to be solid, is it a solid? Is there anything that differentiates it from a single atom of the same material in space, heated to the point where it could be a liquid or gas in the presence of other uranium atoms?

Plasma seems intuitive because you are stripping pieces of the atom away, but what about the three basic phases?

Thank you for your time!

1,598

Comments


westernguy339 t1_jd1qpnp wrote

No, actually. Phases of matter really are about how that matter behaves in relation to itself. A solid, liquid, or gas can only be defined through the relationships atoms have with one another. A single uranium atom in water is a liquid, in air it's a gas, and in a rock it's a solid.

1,577

Chemomechanics t1_jd1ton2 wrote

>Plasma seems intuitive because you are stripping pieces of the atom away, but what about the three basic phases?

Whether a simple material is a solid, liquid, or gas at equilibrium depends on which phase has the lowest Gibbs free energy at that temperature, pressure, and other conditions.

Nature prefers both strong bonding and high entropy, and the Gibbs free energy incorporates both as a tradeoff: It's the enthalpy minus the temperature multiplied by the entropy. This is why the higher-entropy phase always wins at higher temperatures: solid to liquid to gas.
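Here's a minimal Python sketch of that tradeoff: pick the phase with the lowest G = H − T·S at each temperature. The enthalpy and entropy numbers are made up for illustration, not data for any real material.

```python
# Sketch: pick the equilibrium phase as the one with the lowest Gibbs free
# energy G = H - T*S. The enthalpy (H, J/mol) and entropy (S, J/(mol*K))
# values below are illustrative placeholders, not data for any real material.
phases = {
    "solid":  {"H": 0.0,      "S": 40.0},   # strongest bonding, lowest entropy
    "liquid": {"H": 8_000.0,  "S": 60.0},
    "gas":    {"H": 50_000.0, "S": 150.0},  # weakest bonding, highest entropy
}

def gibbs(name, T):
    p = phases[name]
    return p["H"] - T * p["S"]

for T in (300, 430, 600):  # kelvin
    stable = min(phases, key=lambda name: gibbs(name, T))
    print(T, stable)       # solid -> liquid -> gas as T rises
```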

Thermodynamic entropy in this context is an ensemble property that isn't well defined for a single atom, so it doesn't make sense to talk about a single atom having a certain equilibrium phase.

104

luckyluke193 t1_jd22te9 wrote

> From a totally naive point of view it seems like whether matter is a solid, liquid or gas largely has to do with how those atoms behave as a group.

That's correct, phases of matter are properties of large groups of atoms.

Your example of a uranium atom suspended in other matter might not be the best to make this point, because mixtures of different substances can make things more complicated. For example, pure sugar at room temperature is a solid. Add it to a cup of water, and it's a liquid solution. Add much, much more sugar and you get phase separation, with some solid and some syrupy liquid.

153

Neurogence t1_jd2y3te wrote

Hey dude. You are correct that the phase of matter (solid, liquid, or gas) is largely determined by how the atoms or molecules interact with one another as a group. The interactions are driven by factors such as temperature, pressure, and intermolecular forces.

When considering a single atom of uranium suspended in water, the concept of phases is not applicable in the same way as it would be for a macroscopic sample of uranium. This is because phases are macroscopic properties that emerge from the collective behavior of a large number of atoms or molecules. A single atom does not exhibit a phase by itself, as the phase is a result of interactions between atoms or molecules.

To answer your second question, the difference between a single uranium atom suspended in water and a single uranium atom in space would be their surrounding environment and how they interact with it. In water, the uranium atom would interact with the water molecules and any other impurities present. In space, it might interact with cosmic rays, other atoms, or molecules depending on its location. However, neither of these situations would qualify the uranium atom to be classified as a solid, liquid, or gas, as these phases emerge from the collective behavior of many atoms or molecules.

Plasma, as you mentioned, is another state of matter in which atoms are ionized, meaning their electrons are stripped away, and this occurs at high temperatures or under intense electromagnetic fields. This state is distinct from solids, liquids, or gases, which involve neutral atoms or molecules.

So, phases (solid, liquid, or gas) are macroscopic properties that arise from the collective behavior and interactions of a large number of atoms or molecules. A single atom does not exhibit a phase on its own.

33

Hunangren t1_jd35304 wrote

No, it can't. Phases of matter are a description of emergent properties derived from the collective behaviour of a large ensemble of atoms (or, more generally, particles). By definition such properties have no meaning in describing a single (or a few) particles.

To familiarise yourself with the concept of collective behaviour, think about yourself: you too are a collection of cells with emergent properties that no single part of you has. For example, you can be "hungry", "tired" or "sad", although there is no meaning in asking whether any particular cell of your body is "hungry", "tired" or "sad".

The same is true for a collection of atoms. A crystal is solid, but there is no sense in calling any single atom in the crystal solid or not. A single uranium atom in a liquid is neither liquid nor solid: it is part of a liquid.

Talking about plasma, that's a fun topic: in a sense, there are multiple orders of states of matter, describing the collective behaviour of different types of particles. The three "canonical" states of matter (solid, liquid and gas) describe the behaviour of ensembles of atoms. You can extend the logic to subatomic particles, though, obtaining that neutral atoms are the analogue of a solid while plasma is the analogue of a gas; or even to the macroscopic scale, obtaining that "sand" is a macroscopic state of an ensemble of pebbles, which has a viscosity and the ability to occupy any volume, as opposed to "crystal", which is impenetrable and rigid. I suggest the PBS Space Time video "How Many States Of Matter Are There?", which you can find on YouTube. It's really enlightening on the matter. ;)

8

Busterwasmycat t1_jd37dd3 wrote

The physical behavior you call "phase" is that of the group or bulk mass. You have the correct understanding that the individual atom has no definite state of matter, because the state of matter is not a characteristic of the individual. The same atom can be in a different phase or state because it depends on what everything else is doing and how the one atom interacts with its neighbors.

Often, the physical state (whether solid, liquid, or gas) concerns compounds rather than single atoms, so not only is the single atom part of a particular compound, it is that compound which displays a particular state of matter, depending on its conditions and on what else is present and interacting with it.

1

FourChanneI t1_jd3keo6 wrote

Tossing my hat into the ring as a negative novice, my guess is that it would have to do with its interactions with other atoms while in various phases. Take an atom floating in space, what is it? Just an atom, until it interacts with a solid or plasma, gas, liquid, etc. Then what it is changes based upon that principle?

1

Cheetahs_never_win t1_jd3oga1 wrote

A singular atom traversing the vacuum of space, belting out Bobby Vinton? No. Liquids, gases, and solids are functions of temperature and pressure, which are defined by proximity to other atoms.

If you have one molecule of every (non-reactive) gas possible sharing space in a teeny tiny pressure vessel, it could still be deemed a gas mixture if it doesn't sublimate or condense. We just don't have the kinematic equations to describe how that mixture works in a specific sense, just that it still acts like a gas in a general sense.

CO2 is still a gas entrained in soda, even if the H2O molecules separate each CO2 molecule out by a thousand miles. We say this because if we agitate all million square miles, the CO2 comes out and floats off with the rest of the gases.

1

dalnot t1_jd3rsxv wrote

It’s not really any specific form of matter, for the reasons given elsewhere in this thread, but a single atom is certainly most similar to a gaseous state. This is because gases are the state in which atoms interact the least with each other. In outer space, it’s mostly individual atoms flying around, but we still call it a gas, just an incredibly dilute one.

5

nobody_in_here t1_jd408q9 wrote

Don't mean to hijack the post, OP has a great question, but their question made me want to ask something similar: salt, let's say sodium chloride, from what I understand it dissociates into its constituent ions when in water. Like it becomes free Na and Cl just floating around in water right? Would that mean if you saw free Na and free Cl ions swimming around, and they're not bonding, you could assume it's a liquid, or no?

2

LoyalSol t1_jd417r3 wrote

One bit of a nitpick: entropy is still very well defined even at the atomic level. A lot of excitation phenomena are dictated by entropy.

There are many different types of entropy, but they are all related to the same underlying concept.

It's one of the few bulk properties that actually has a near one-to-one correspondence to its micro-scale counterpart.

23

SatanScotty t1_jd4dere wrote

Could you measure the kinetic energy of the entire atom and estimate “this atom has a level of energy consistent with a solid”?

Could you also note whether or not it’s ionized in a way consistent with plasma?

107

istasber t1_jd4fe6y wrote

Atoms (at least atoms larger than beryllium, give or take) are basically classical particles for all intents and purposes. They have momentum (assuming non-zero temperature) and mass, and basically just keep flying in a direction until they hit something or a force acts on them to pull them in a new direction.

In a solid, the interactions with nearby atoms (through e.g. electrostatic interactions) and the degree to which the atoms are packed mean each ball is basically just vibrating in place.

In a molecule, "bonds" are just forces resulting from shared electrons that make it really tough to pull the atoms apart, but the atoms are still basically just balls moving in a direction until they bounce into something or a force pulls them in another direction.

There's some quantum weirdness about the nature of the forces themselves, but atoms generally obey F = ma just the same as macroscopic stuff.
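A rough sketch of that picture, in arbitrary made-up units: one classical "ball" in a harmonic cage obeying F = ma, integrated with velocity Verlet, just rattles around its site the way an atom in a solid does.

```python
# Rough sketch (arbitrary units): one classical "ball" held near a lattice site
# by a harmonic restoring force, integrated with velocity Verlet. It obeys
# F = m*a and just vibrates in place, like an atom in a solid.
m, k, dt = 1.0, 1.0, 0.01      # mass, spring constant, time step
x, v = 0.5, 0.0                # start displaced from the site at x = 0

def force(pos):
    return -k * pos            # harmonic "cage" from the neighbours

f = force(x)
positions = []
for _ in range(5000):
    x += v * dt + 0.5 * (f / m) * dt**2
    f_new = force(x)
    v += 0.5 * (f + f_new) / m * dt
    f = f_new
    positions.append(x)

print(min(positions), max(positions))  # stays bounded near 0: vibration, not escape
```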

3

Jasmisne t1_jd4g988 wrote

When you are studying chemistry, in quantum mechanics we have a thought experiment/math workthrough called the 'particle in a box.'

Basically, if you have one hydrogen atom in a box with nothing else, you only have to deal with the physics of that one atom bouncing off the sides of the box.
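For what it's worth, the standard 1-D version of that setup gives the levels E_n = n² h² / (8 m L²); here's a quick sketch with illustrative numbers (a hydrogen atom in a 1 nm box), which may differ from however the course sets it up.

```python
# Sketch: textbook 1-D particle-in-a-box levels, E_n = n^2 * h^2 / (8 * m * L^2),
# for a single hydrogen atom confined to a 1 nm box. Numbers are illustrative.
h = 6.626e-34        # Planck constant, J*s
m = 1.674e-27        # mass of a hydrogen atom, kg
L = 1e-9             # box length, m

for n in (1, 2, 3):
    E_n = n**2 * h**2 / (8 * m * L**2)
    print(f"n={n}: E = {E_n:.2e} J")
```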

Now, we said one H atom: one proton and one electron. Your example of uranium is a problem because for every molecule we have two groups of forces: the ones between it and the world, and the ones within itself. Uranium is not stable on its own and is subject to a tremendous amount of force within itself; those are a lot of different protons, neutrons and electrons, all exerting forces on each other.

So, short answer: no. Long answer: no again, but because it is infinitely more complex, and even when we examine a simple scenario we are ignoring factors, simply because the dynamics of molecules are way, way more complicated than solid, liquid and gas.

2

PercussiveRussel t1_jd4hspw wrote

Can you point me to specific books or papers (or even terms) that clarify this further, because from my thermodynamics and stat-phys (and I guess solid-state) knowledge I would definitely call entropy an ensemble property (I'd call it the ensemble property).

I'd guess that you could be talking about mixed-state density matrices, but even that would involve multiple objects, no?

3

cracksmack85 t1_jd4k75m wrote

Semi-related question - when I took gen chem in college, if a solid was dissolved in a liquid it was always denoted as “aqueous” and we were told to treat it as a liquid, but it was never really clear to me whether that substance is a liquid or a solid. Can you explain the “aqueous” designation?

12

lizardweenie t1_jd4l91d wrote

An atom has internal degrees of freedom due to electronic and spin transitions, so it can certainly be excited. In general though as previous users mentioned, temperature is an ensemble property, so the atom wouldn't have a well defined temperature.

60

[deleted] t1_jd4mf0y wrote

Aqueous here means that the solution, which consists of a solvent and a solute, has water as the chosen solvent.

An example is sodium chloride. Sodium chloride (aq) simply means that you have a saline solution: the individual ions of sodium chloride are dissolved in liquid water.

I wouldn't really call NaCl a liquid if you have NaCl (aq), because you're no longer dealing with pure NaCl. The aqueous solution is a liquid, though.

This leads to an interesting question about solids that do not dissolve in a liquid. There you're dealing with a type of colloid (effectively one phase suspended in another phase). The dispersed compound does not have to be a solid, by the way; it can be a liquid (as in milk).

19

RestlessARBIT3R t1_jd4n0l2 wrote

Aqueous just means that it's dissolved in water. Ionic solutes get surrounded by water molecules because the water molecules are polar and the ions are charged. Non-ionic solutes are usually just polar and can form hydrogen bonds with the water molecules, so the result looks like a single solution, as opposed to when things don't dissolve.

1

glurth t1_jd4n4io wrote

"You need internal degrees of freedom to define temperature."

This sounds inaccurate. Wouldn't the excited states of a single atom's electrons be an internal degree of freedom?

8

Chemomechanics t1_jd4pbix wrote

> One bit of a nitpick: entropy is still very well defined even at the atomic level. There are many different types of entropy, but they are all related to the same underlying concept.

Isn't it clear from the context that I'm referring to the thermodynamic entropy as applied to ensembles of molecules to determine the equilibrium bulk state?

−1

Chemomechanics t1_jd4ptcm wrote

> Like it becomes free Na and Cl just floating around in water right?

They aren't free, they're solvated—that is, they're stabilized through interactions with the surrounding solvent. This is why they don't immediately bond, so we can't leave this crucial aspect out of the picture.

3

LoyalSol t1_jd4r58z wrote

So, to be a bit careful about how we define things: yes, entropy is still directly tied to an ensemble, in that it is directly related to the probability of an observation, and probability is of course tied to thousands of observations. But the key is that entropy can be observed in any type of probabilistic system and will very often behave the same way in a system with millions of atoms as in a system with a single particle. It will just be tied to different averages, such as the time average, spatial average, etc.

Where entropy is distinguished from many other bulk properties is that the latter are often the result of thousands of atoms acting in unison, whereas entropy can be observed even in a single-particle system. This is especially true when talking about quantum descriptions of molecules.

For a single particle, the Jacobian of the principal coordinate is the entropy term.

Say, for example, you have a classical particle that is attracted to a single point by the equation

E(r) = 1/2 * k * (r - r0)^2

In this system we can simply write the Jacobian as a function of r. For an N-dimensional system,

J(r) = r^(N-1)

assuming we integrate the angular terms out. Now perform a simulation of the particle with a given momentum. In a system with conserved momentum, while the lowest-energy position is at a distance r0 from the center, the time-averaged value of r will only be r0 if we perform the simulation in 1 dimension. If we have two dimensions, you will notice the value sits somewhat above r0, and as we add more and more dimensions the particle deviates more and more outward from r0. That is because as you increase the number of accessible dimensions, you increase the translational entropy. A hyper-dimensional particle will spend very little time near r0 despite r0 being the most stable position.

You don't need multiple equivalent systems to observe this. The time average of a single particle will give rise to this.

In statistical mechanics we usually define these things in terms of a number of equivalent systems because, in practice, that's what we are typically measuring, and we take advantage of the ergodic hypothesis to link the time average to other averages of interest. But the thing about entropic effects is that they show up even in atomic and sub-atomic systems, and many behaviors are a direct result of them. For example, if an electron can be excited to a higher set of orbitals that all have the same energy, and one sub-orbital has more momentum quantum numbers than another, that sub-orbital will be preferred simply because it has more combinations available.

Larger systems have more forms of entropy they can take advantage of, such as swap entropy, rotational entropy, etc., but the rules and interpretations are still very much the same whether you have a million particles or just one. That's not always the case for other bulk properties; sometimes a bulk property is only observable in the limit of the average and not for a single particle.
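Here's a rough numerical sketch of the harmonic example above, with illustrative parameters (k = 1, kT = 1, r0 = 2): the Boltzmann-weighted average radius, with the r^(N−1) Jacobian included, drifts outward from r0 as the dimensionality N grows.

```python
import numpy as np

# Numerical sketch of the example above (illustrative parameters k = 1, kT = 1,
# r0 = 2): the Boltzmann-weighted average radius, including the radial Jacobian
# r^(N-1), moves outward from r0 as the number of dimensions N grows.
k, r0, beta = 1.0, 2.0, 1.0
r = np.linspace(0.0, 10.0, 200_001)

def mean_radius(N):
    weight = r**(N - 1) * np.exp(-beta * 0.5 * k * (r - r0)**2)
    return np.trapz(r * weight, r) / np.trapz(weight, r)

for N in (1, 2, 3, 10, 30):
    print(N, round(mean_radius(N), 3))   # <r> increases with N: entropy pushes outward
```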

13

florinandrei t1_jd4ubra wrote

Technically, you can associate a temperature to the velocity of the atom measured relative to the container, and therefore obtain a "temperature" for that atom. But a lot of concepts become quite strained when you reduce things to single atoms, and temperature is one of them. A single atom does not have a temperature in the normal sense.

To your initial question: the phases of matter are only defined for molecular or atomic collectives. Single molecules or atoms do not have a clearly defined phase of aggregation. Even for large molecular collectives it is not always clear whether they are solid, liquid, or gas. For example, on geologic time scales, even some "solids" can flow.

The phases of matter are more like convenience concepts. We use them to simplify discussions that would otherwise be complex. There's nothing fundamental about them. Do not get stuck in rigid categorizations there, because there's no point in doing that.

112

avoid3d t1_jd4urxm wrote

Hmm, in physics we learned that a more nuanced way of reasoning about temperature is relating it to the change in entropy as heat is added.

If I understand you correctly, you are arguing that heat cannot be added to a single atom since there are no intermolecular forces to create oscillations to store the heat.

I’d argue that heat can be added since there are other kinds of energy states that are possible in a single atom such as electric phenomena.

Is there something I’m misunderstanding?

edit This lovely commenter explains this topic very well:

https://www.reddit.com/r/askscience/comments/11x4f9t/comment/jd4r58z/

13

Fanburn t1_jd4vekw wrote

A grain of sand is just that, a grain of sand. Two grains of sand are two grains of sand.

If you add more and more sand, at some point you can say you have a pile of sand, and you can describe it with new properties.

Atoms are basically the same: you need a bunch of them, and then you can describe them with new properties such as viscosity, state of matter, and so on.

0

LoyalSol t1_jd4w78f wrote

Yes, but even that still has a one-to-one correspondence with the partition function, which counts the number of accessible states.

Thermodynamic entropy is actually well defined at the atomic level, whereas many other properties only exist in the bulk limit.

5

RevengencerAlf t1_jd4xdut wrote

I think this is both true and kind of not, and it gets weirdly philosophical. It doesn't have temperature as we're taught about it in HS physics class, sure, since that is generally the internal kinetic energy of molecules vibrating and bumping into each other, but atoms themselves have internal degrees of freedom at the quantum level that can reasonably be used to describe a temperature. The excitation state of an atom's electrons is the most obvious one.

9

PercussiveRussel t1_jd50td1 wrote

Ah yes, this helps a lot. Brings back a lot of statphys memories too. Thank you very much.

In a way, a time averaged system could be described as a mixed-state density matrix I suppose, which is where my intuition comes back again. I always picture a single object as being in a pure state, but there are ways it doesn't have to be.

Because when you say that entropy is tied to the probability of an observation, that really doesn't hold for an object in a superposition, since its multiplicity of states is just 1 (the superposition itself), which is where we do need to be careful I guess. I'd call it classical probabilistic, and avoid all confusion with quantum probabilistic.

So, to get more philosophical: It feels like there needs to be some sort of "outside influence" on a single particle for it to have entropy. Would you agree with this line of thinking? For some definition of outside influence.

That is not me trying to say my intuition was right by the way, it wasn't.

5

beansahol t1_jd516nl wrote

Nah. For covalent molecular substances the state is going to be determined by the strength of the van der Waals forces. For ionic compounds the melting and boiling points will relate to the electronegativity...distance and shielding making the attraction weaker. For metals it must be the charge of the cation. Technically, if you had one atom, it has none of these intermolecular, covalent or ionic forces at play, so you could call it a gas at an incredibly low concentration.

5

RedditAtWorkIsBad t1_jd51cph wrote

And to add to the comment about how velocity is relative: even if you have a large mass of material moving quickly, this doesn't make it hotter. So velocity isn't by itself the metric you need, but rather the variance in velocity, where velocity is a vector quantity. This way you get a picture of the range of differences in velocity among the particles. Temperature is directly related to this (and would only be related to this for simple point masses that interact like billiard balls).
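A quick sketch of that idea with made-up numbers: compute a kinetic temperature from the velocity fluctuations about the centre-of-mass motion, and note that giving the whole sample a large drift velocity doesn't change it.

```python
import numpy as np

# Sketch: the "kinetic temperature" of a cloud of identical atoms comes from the
# spread of velocities about the centre-of-mass motion, not from the bulk speed:
#   (3/2) kB T = (1/2) m <|v - v_cm|^2>
# Adding a constant drift to every atom leaves T unchanged. Illustrative numbers.
kB = 1.380649e-23        # J/K
m = 6.63e-26             # kg, roughly an argon atom
rng = np.random.default_rng(0)

v = rng.normal(0.0, 350.0, size=(100_000, 3))   # thermal-ish velocities, m/s
v_drifting = v + np.array([1000.0, 0.0, 0.0])   # whole sample moving fast

def kinetic_temperature(vel):
    dv = vel - vel.mean(axis=0)                  # remove centre-of-mass velocity
    return m * (dv**2).sum(axis=1).mean() / (3 * kB)

print(kinetic_temperature(v), kinetic_temperature(v_drifting))  # ~the same T
```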

3

MarzipanMission t1_jd520vi wrote

How is thermodynamic temperature different from viewing it from a kinetic perspective?

Does that mean that the movement of atoms relative to each other, in the kinetic sense of temperature, is not what temperature refers to in thermodynamics? So temperature is not a universal concept then? It is context dependent, and has many definitions?

3

octonus t1_jd54mix wrote

We discuss states of matter in terms of how a molecule interacts with its neighbors. If the solvent is a liquid (water in the case of an aqueous solution), all the interactions you would care about are liquid-like. In the case of a solid solution, e.g. bronze, they would be solid-like.

The key reason we note it as being a solution rather than a liquid is to point out that the neighbors a molecule interacts with are solvent molecules, rather than molecules of the same type.

2

Solesaver t1_jd578qg wrote

One other wrench to throw into this thesis is that phase of matter is actually determined by a combination of temperature and pressure. The pressure of a single atom is also not a meaningful metric.

2

LoyalSol t1_jd586m6 wrote

>Because when you say that entropy is tied to the probability of an observation, that really doesn't hold for an object in a superposition, since its multiplicity of states is just 1 (the superposition itself), which is where we do need to be careful I guess. I'd call it classical probabilistic, and avoid all confusion with quantum probabilistic.

It gets a little strange in quantum mechanics, but you still have entropy effects there. But yeah, it gets kind of hairy, just because superpositions themselves are already strange to begin with.

It's been a while since I focused on quantum stuff so I won't go too much into those since I'll probably get myself into trouble. :)

>So, to get more philosophical: It feels like there needs to be some sort of "outside influence" on a single particle for it to have entropy. Would you agree with this line of thinking? For some definition of outside influence.

It's easier to understand with an outside influence, but even in the situation of, say, a classical particle in a box where all points in the box are equally probable, the more dimensions you have, the less likely you are to observe the particle in the center of the box, simply because there is more volume toward the edge of a hyper-cube than in the center, and this effect grows with dimension.

I guess we could say the box is an outside influence, but then would we even have a system without any constraints whatsoever? I would have to think about that.

For an isolated particle, the volume of the space it occupies is where it gets its entropy from. Even for a quantum particle in a box the trend holds, just not uniformly, since you have a wave function. The odds of observing the particle near the center of the box go to 0 as the number of dimensions increases; you're more likely to observe it near the edge in higher dimensions.

As a bit of trivia, this is also why the translational term in the partition function is usually the only one with a volume component: the other forms of entropy deal with internal degrees of freedom, whereas translational entropy deals with the space of the system.

3

sticklebat t1_jd58ks6 wrote

Thermodynamic temperature is defined as the rate at which the internal energy of a system changes as its entropy changes.

In contrast, temperature from kinetic theory is essentially a measure of the average translational kinetic energy of the particles in a system.

The two are sometimes, but not typically, equal. The temperature that you know and love is the second one, but thermodynamic temperature is also widely used in science.
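For reference, the two definitions being contrasted, written out in their standard textbook forms:

```latex
% Thermodynamic temperature: defined from how entropy changes with internal energy
\frac{1}{T} \;=\; \left(\frac{\partial S}{\partial U}\right)_{V,N}

% Kinetic-theory temperature: average translational kinetic energy per particle
\left\langle \tfrac{1}{2} m v^{2} \right\rangle \;=\; \tfrac{3}{2} k_{B} T
```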

4

garrettj100 t1_jd59lci wrote

Another issue is that the energy of an atom doesn't determine its temperature. Not exactly.

The high-school definition of temperature as the average kinetic energy of the particles is merely an approximation, appropriate only for gases. Hence the "ideal gas law".

It's better to think of temperature as a thermodynamic arrow: heat flows from higher temperatures to lower ones. The rigorous definition of temperature is the inverse of the derivative of the entropy with respect to energy:

T = 1 / (∂S/∂E)

As you add more energy to a system, its entropy grows, but because entropy is logarithmic it grows more slowly, so the derivative gets smaller and the temperature rises.

The flow of energy from high temperatures to low temperatures means that total entropy rises, because the system at the lower temperature gains more entropy from the same infinitesimal amount of energy. That's how the universe obeys the second law of thermodynamics: entropy always increases.
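A toy numerical sketch of that definition (a made-up system of N two-level units, nothing specific from this thread): compute S(E) = k_B ln Ω by counting microstates and take T = 1/(dS/dE) by finite differences. Temperature rises as energy is added, and past half-filling this particular toy model even gives the negative temperatures mentioned elsewhere in the thread.

```python
from math import lgamma

# Toy sketch: N two-level units, each either in its ground state or excited by eps.
# With n units excited, E = n*eps, Omega = C(N, n) microstates, S = kB*ln(Omega),
# and T = 1 / (dS/dE) via a finite difference. Units where kB = eps = 1; made up.
kB, eps, N = 1.0, 1.0, 1000

def entropy(n):
    # kB * ln C(N, n), via log-gamma to avoid huge factorials
    return kB * (lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1))

def temperature(n):
    dS_dE = (entropy(n + 1) - entropy(n)) / eps
    return 1.0 / dS_dE

for n in (100, 200, 300, 400, 490):
    print(n, round(temperature(n), 3))    # T rises as energy is added

print(600, round(temperature(600), 3))    # past half-filling: a negative temperature
```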

2

8426578456985 t1_jd5aonm wrote

I want to say no, but relative to what? Say the jar is floating in space with no radiation hitting it and the hydrogen atom is just teleported into the jar so their relative speeds are zero at the start.

2

Chemomechanics t1_jd5cot7 wrote

Sorry, I don’t see how this helps the OP. It sounds like you’re talking about looking at the behavior and any transitions over a very long time rather than relying on the ergodic hypothesis and stat mech assumptions based on large N. OK, so now you’ve calculated what you consider the entropy. I don’t get how this allows the OP to classify the atom as a bulk solid, liquid, or gas when it’s a lone aqueous atom.

0

Dr-Luemmler t1_jd5f45s wrote

Ehm, what? I know what you are saying, but that's just because you need some kind of interaction to measure ANYTHING. In other words, with that logic you couldn't even measure the momentum of a flying particle, because to measure it the particle would need to interact with another particle somehow. In a simulation, for example, you could measure the energy of a single particle and then determine its state; that works in MD. For DFT simulations you could also use the electron probability densities to determine the distance to other particles.

This kind of access to the physics is of course not available for a single real atom.

2

Dr-Luemmler t1_jd5fsbi wrote

Maybe I don't get what you are saying about temperature, but it doesn't make sense to me. If a single atom didn't have a temperature, because it can't have a velocity alone, what happens if we drop a second atom in the void? Does (kinetic) energy now spawn from nothing? Besides that, temperature itself isn't relative, as we have a true zero. Even if it is just theoretical.

5

TheArmitage t1_jd5hp7t wrote

>what happens if we drop a second atom in the void?

In doing so, you've introduced energy into the system. That atom had to get there somehow, and that takes energy.

>Besides that, temperature itself isn't relative, as we have a true zero.

Yes, it is. It's just self-referential. Thermal motion is the motion of atoms in a substance relative to each other. So if all atoms in a substance have zero motion relative to each other, it has a temperature of 0K.

6

Dr-Luemmler t1_jd5io43 wrote

That's my point. Of course, in a laboratory you need a reference to measure the velocity of a single atom. The reference frame can obviously be broken down to other atoms if you want, but that doesn't mean a single atom can't have kinetic energy by itself.

2

yakbrine t1_jd5k2pr wrote

To my understanding, that is kind of his point. There are probably tons of variables like this for every solid and liquid, and the sole fact that they are solid or liquid does not give them said properties, or everything would be identical. The point is that everything is extremely nuanced, and we've created these categorizations so we don't have to define everything as a mathematical equation instead of 'solid, liquid, gas'.

17

Acewasalwaysanoption t1_jd5ll20 wrote

Nobody said that we can't have a reference point, just that we have a single atom of an element, as opposed to a macroworld-sized amount to easily determine its phase.

Like if I'm the last person on the world, I can't tell if I'm handsome or if I'm rich, without other people to compare myself. But I know how fast I am, because I don't need other people for a reference system.

3

Acewasalwaysanoption t1_jd5oj3o wrote

Sorry, I may have misread something.

New question: what exactly do you mean by "compared to itself"? It can't literally be compared to itself in the same state, as it would then be the same all the time. It can't be a chunk of the material either, or any material that has the same temperature at its core and surface would be at zero difference and... incomparable?

Also, isn't using a thermometer using an external point of reference in general? Originally, mercury's change in volume was used to tell a completely different material's temperature. It works because of energy transfer.

1

my58vw t1_jd5szy4 wrote

An atom has kinetic energy due to the random movement of the electrons within it. The movement of the electrons is, theoretically, relative to the nucleus, and since kinetic energy is a function of mass and velocity, the components of the atom have a speed and thus a temperature. A self-contained particle nearing absolute zero could again be stationary.

Temperature in a normal situation relies on other particles, but then, as others have said, this breaks the standard conventions of physics and chemistry.

0

willowsword t1_jd5v2pe wrote

As others have said, no. Phase, like temperature, is an emergent property, which means that it appears upon interaction with others. Like the saying, "the whole is greater than the sum of its parts." https://en.m.wikipedia.org/wiki/Emergence

Extra info no one asked for:

Non-linear pattern formation is a branch of soft condensed matter physics that studies emergent patterns. Look up books by Philip Ball. Also the prof from the increasing-size dominoes meme, Dr. Stephen W. Morris, researched in this area before his recent retirement.

1

ashpens t1_jd5ze5j wrote

Sort of similar concept to cells and whether or not they constitute a tissue or an organism. If you're zoomed all the way in on one cell in isolation, it loses the relativity you would need to determine higher classifications.

1

LoyalSol t1_jd5zils wrote

I made the point about time to show how you can prove the entropy exists in the absence of large N. It is not exclusive to that, however.

I was not making a point about the OP's question so much as clearing up a statement that entropy is a strictly bulk property, when it isn't. Unlike many thermodynamic quantities, entropy is actually defined the same way on both the macro and micro scale.

That's why I called it a nitpick.

1

sudomatrix t1_jd6007m wrote

If a single atom does not have a 'temperature' or a 'state of matter' but only the interaction between atoms has these properties... then where is the temperature "stored" in the atom? How does an atom "know" from one instant to the next what its temperature is?

1

florinandrei t1_jd6br1i wrote

> they exhibit different physical properties including changes in electrical conductance

Of course they do. I'm just saying - the borders between them are far more fuzzy than most people imagine.

E.g. consider the changes that occur in tar or pitch when cooled from the boiling point of water to the boiling point of nitrogen. It's liquid at one end. It's solid at the other. The changes are smooth, without any sharp transitions.

7

florinandrei t1_jd6bwq6 wrote

It definitely does have a kinetic energy.

The only thing is - when you go from kinetic energy to temperature, you run into all sorts of trouble if you do it for single entities.

Temperature is an inherently collective measure. If it's single particles, stick to kinetic energy.

What is the "temperature" of this marble I'm throwing? ;) (not the temperature of the glass, but the "temperature" of the marble as a single particle with some kinetic energy)

2

Purplestripes8 t1_jd6moch wrote

By compared to itself it means the motions of the atoms within an object relative to each other. The object itself can have any velocity depending on the observer but no matter which direction it's moving as a whole or how fast, the atoms within still have the same motions relative to each other, which is signified by temperature.

1

KarlSethMoran t1_jd6vmwt wrote

I don't get what you mean by "out of phase". Gravity is exceedingly weak at the atomic scale, it can be safely ignored.

Atoms feel van der Waals attraction. It's a very weak interaction, but still billions of billions of times stronger than gravity at this scale. It will get even noble gases into a crystal when there's sufficiently little motion.

2

Dr-Luemmler t1_jd75stk wrote

Defining temperature by kinetic energy, you could calculate it for a single marble. If you want to use the more advanced definition of temperature via entropy, sure, let's do it:

T = dE/dS

So temperature is the change of internal energy when changing the entropy. In statistical thermodynamics, one can define entropy via the number of accessible states Ω and the degrees of freedom.

The internal degrees of freedom a single atom has are 3N - 3 = 0 (with N = 1). That basically means this atom only has the translational dofs and the electronic ones. Let's neglect the electronic ones (even though they might be important, since with them we might be able to measure the temperature); then the temperature of a single atom is solely defined by its kinetic energy.

Can we access it in the laboratory without using interactions with other atoms? No! But in simulations we can. Or what kinds of problems do we have?
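As a sketch of the "in simulations we can" point (not a claim about any particular MD code's conventions): the instantaneous kinetic temperature that simulation packages typically report reduces, for one free atom, to T = m|v|² / (3 k_B).

```python
# Sketch of the "in simulations we can" point: molecular-dynamics codes often
# report an instantaneous kinetic temperature T = m*|v|^2 / (n_dof * kB). For a
# single free atom n_dof = 3 (translation only). Whether this deserves the name
# "temperature" is exactly what the thread is debating; numbers are illustrative.
kB = 1.380649e-23            # J/K
m = 3.95e-25                 # kg, roughly one uranium atom
v = (250.0, -120.0, 80.0)    # m/s, some instantaneous velocity from a trajectory

ke = 0.5 * m * sum(c * c for c in v)
T_kin = 2 * ke / (3 * kB)
print(T_kin)                 # "temperature" assigned to this one atom at this instant
```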

2

Dr-Luemmler t1_jd7c3ex wrote

>So temperature is not a universal concept then? It is context dependent, and has many definitions?

Yes it has, but the definitions are all different sides of the same coin. In other words, they add complexity, but are more or less the same. They are not contradictory.

Temperature IS the average kinetic energy of a system's particles. That's not just the classical definition; it is also the result if you combine quantum theory and statistical thermodynamics. This kinetic energy is not only translational but also rotational and vibrational. A single atom, though, does not have the degrees of freedom to rotate or vibrate. Besides its spin, but that is not important here.

There is another dof, and that's the electronic one. Yes, energy can also be stored there. It's probably also negligible here, but OP was very imprecise with his thought experiment...

1

NeoRemnant t1_jd7lcn9 wrote

Can drops of water not be raindrops simply because they are each measured separately? The water droplet knows not of the rain.

Simply put: 1. A singular lonely atom cannot be heated, as heat is a quantification of relative atomic Brownian motion (local interactions caused by relative atomic velocity); therefore heat cannot be transmitted in a vacuum. 2. Individual atoms with no interaction have null molecular density, and so they are gaseous. 3. Pressure and temperature are functions of atomic density and momentum.

1

lizardweenie t1_jd92qsl wrote

As other posters have mentioned, temperature is a property of a distribution. It tells us the probability of populating an excitation of a given energy. This isn't up for debate, it's just a matter of definitions. If you want to come up with some new concept that is well defined for a single particle, that's cool, but temperature doesn't work for single particles.

0

lizardweenie t1_jd9xip3 wrote

I don't mean to be rude, but it really seems like you haven't learned about temperature in a rigorous way (Like you would in a statistical mechanics class). It sounds like you've at least had some sort of exposure to undergrad level quantum mechanics, which is great. But recognize that your knowledge may not apply to this, and consider taking a statistical physics class.

If you did take such a class, you would learn that beta (which is proportional to 1/T) can be defined in terms of the partition function of the system of interest, but the entire concept relies on having multiple particles (not simply one particle that transitions from state to state).

1

Dr-Luemmler t1_jdaglj7 wrote

I don't want to be rude, but you just need different states the particle could be in, which you get from the quantization of momentum in each direction and from the electronic states. Having multiple indistinguishable particles and measuring their states is just one way to calculate the partition function. Another is to track the trajectory of a single particle. In other words, we just need different states with different probabilities. I see no reason why that would not hold for a single atom. Temperature itself is also not a relative measurement, as you can also see temperature-dependent radiation from only a single atom.

The temperature in thermodynamics is defined as T = dE/dS. As S ~ log(Ω), the number of "accessible" states needs to increase with increasing temperature for the first formula to hold. As a single atom has three dofs, we fulfill that.

Sorry, I really see no reason you could be right. I have also studied a bit of advanced statistical thermodynamics and wrote my BA thesis in that field. But I can be wrong; I can't say I was excellent in that field and some years have passed since then. Maybe you can give me some hints for proper literature.

1

lizardweenie t1_jdamc26 wrote

No worries, you're not being rude. As for references, this is a matter of basic definitions, so I'd recommend some good textbooks, depending on your background.

I'd say that Chandler's book is pretty good: (I used it at the beginning of my PhD) http://pcossgroup.xmu.edu.cn/old/users/xlu/group/courses/apc/imsm_chandler.pdf

If you're looking for a different perspective, I've heard good things about Reichl: "A Modern Course in Statistical Physics"

Fun fact about this statement: > the number of "accessible" states needs to increase with increasing temperature for the first formula to hold

This need not be the case. In certain scenarios, you can actually obtain negative temperatures which are perfectly valid. https://en.wikipedia.org/wiki/Negative_temperature

2

lizardweenie t1_jdapyy8 wrote

I just thought of a reasonable thought experiment that might clarify your confusion:

Say you have a bath of non interacting hydrogen atoms (consider for a moment, only electronic excitation), and we are able to measure the state of each atom.

Say we measure this bath and find that f_0 fraction are in the ground electronic state E0, and f_1 are in the first excited state E1. We could then infer a temperature by comparing these populations to a Boltzmann distribution, which tells us the relative probability of finding an atom in a state at a given energy (for a given temperature). In this case temperature is a well defined and meaningful concept.

Now say instead that we have a single hydrogen atom, we measure its state, and we find that it's in the first excited state. What then is the temperature? If we try to infer a temperature from this, (using a Boltzmann distribution), we get -inf. Say instead we measure it, and it's in E0. In this case, our inferred temperature will be 0. So for this single atom system, any temperature that we try measure can only give two values, (0, or negative infinity). In this system, clearly temperature isn't behaving how we would like it to.

This troublesome result points to a larger problem with the question: asking "what is the probability distribution for state occupation?" doesn't really work well for the example. The atom was measured and determined to be in state E1, so its probability distribution is a delta function, which is an inherently non-thermal distribution.
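A small sketch of that inference with illustrative numbers (a hydrogen-like 10.2 eV gap): from ensemble fractions f0 and f1 the Boltzmann ratio gives a sensible temperature, while a single measured atom pushes the formula to a degenerate limit.

```python
import math

# Sketch of the inference above, with illustrative numbers (hydrogen-like 10.2 eV
# gap between E0 and E1): the Boltzmann ratio f1/f0 = exp(-(E1 - E0)/(kB*T))
# gives T from the measured ensemble fractions f0 and f1.
kB = 8.617e-5       # Boltzmann constant, eV/K
dE = 10.2           # E1 - E0, eV

def inferred_temperature(f0, f1):
    return dE / (kB * math.log(f0 / f1))

print(inferred_temperature(0.999, 0.001))   # a whole bath: a sensible finite T
# A single measured atom forces f0 or f1 to be exactly 0 or 1, the log diverges,
# and the inference degenerates -- which is the point being made above.
```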

2

Quantum_Quandry t1_jde90h1 wrote

A single atom most definitely cannot have kinetic energy all by itself. SR/GR makes it abundantly clear that you must have something to reference against to make a measurement, and the answer changes depending on which reference point you're using. This should be obvious to anyone who has driven a car. Say you have three cars: yourself going 50 mph north, a second car ahead of you and to your left going 45 mph north, and a third car ahead of you headed south at 50 mph. You have to swerve left or right due to an obstacle ahead; which do you choose? Obviously you're going to swerve left: ignoring the velocity of the swerve itself, you'd overtake the car on your left at a relative 5 mph, whereas if you went right you'd be closing at a relative 100 mph. Or you could split the difference and drive directly into the obstacle, which is moving at 50 mph relative to you.

0