Physics Stack Exchange is a question and answer site for active researchers, academics and students of physics.

Why does heat added to a system at a lower temperature cause a higher entropy increase?

Entropy is defined in my book as ΔS = Q/T. To derive the formula, it says that entropy should be directly proportional to the heat energy, since with more energy the particles would be flying all over the place more rapidly. This makes sense. But then it says that entropy must be inversely proportional to the temperature at which the heat is added. To be precise:

Heat added to a system at a lower temperature causes a higher entropy increase than heat added to the same system at a higher temperature.

How does this make intuitive sense?

I found an answer here. I think this makes sense. Can anyone read it and tell me whether it's correct?
Rough intuitive answer:

Adding heat to a system increases the system's total energy. This gives more kinetic energy to
distribute among the particles in the system, increasing the size of the system's phase space
and hence its entropy. However, since momentum is proportional to the square root of kinetic
energy, it is clear that the total momentum phase space cannot grow exponentially as you add
more kinetic energy; so the rate of growth of its logarithm must taper off. Since entropy is
proportional to the log of the size of the phase space, we see that each additional unit of heat
must increase the entropy by less and less.

Now, for the technical details. The reason why the relationship is precisely dS = δq/T, rather than some more complicated relationship that depends on the nature of the system, is that temperature is defined such that this is true. We know that
during first-order phase transitions, the temperature actually stays constant as you add heat!
This corresponds to a region where adding heat actually does cause an exponential increase in
phase space. Finally, note that some systems actually have negative temperature, and adding
more heat to them actually makes the temperature even more negative; so adding heat actually
decreases the phase space. So the intuitive answer in the previous paragraph shouldn't be
taken too seriously.
EDIT 2: Moreover, this answer explains why there needs to be an additional factor of temperature along with heat in the definition of entropy, so as to be consistent with the second law.

This is why heat flows from hot objects to cold objects: the entropy change of the hot object is negative, while that of the cold object is positive, and the magnitude of the entropy change for the cold object is greater:

ΔS = Q/Tc − Q/Th > 0, since Th > Tc.
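A quick numerical sketch of this inequality; the values of Q, Th, and Tc below are made up for illustration:

```python
# Net entropy change when heat Q flows out of a hot body at Th and into a
# cold body at Tc. Temperatures in kelvin, Q in joules; all values made up.
Q = 100.0
Th, Tc = 400.0, 300.0

dS_hot = -Q / Th          # hot body loses entropy
dS_cold = Q / Tc          # cold body gains more entropy than the hot one lost
dS_total = dS_hot + dS_cold

print(f"dS_hot  = {dS_hot:+.4f} J/K")
print(f"dS_cold = {dS_cold:+.4f} J/K")
print(f"total   = {dS_total:+.4f} J/K  (> 0, as the second law requires)")
```

Any Th > Tc > 0 gives a positive total, which is exactly the ΔS = Q/Tc − Q/Th > 0 statement above.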
Keep in mind that entropy increases with temperature. This can be understood intuitively in
the classical picture, as you mention. However, at higher temperatures, a certain amount of
heat added to the system causes a smaller change in entropy than the same amount of heat at a
lower temperature.

The formula is ΔS = Q/T. The change in entropy is related to heat. Remember that heat is a form of energy transfer, not energy itself; we can talk about heat only when some change takes place.
Please do point out if there is anything wrong with the answers above.
Tags: thermodynamics, temperature, entropy

edited Feb 10 '15 at 17:04
asked Dec 15 '14 at 17:11
8 Answers

Answer (score 8)
The formula is actually better written

dS = δQ/T

That is, the change in entropy associated with the flow of heat is inversely proportional to the temperature at which the heat flow occurs. Note that Q is already a change itself: it is not a state variable, but rather something more like ΔW. Physically, this is because adding heat to a hot system doesn't disorder it as much as adding the same heat to a cold system.

If T is changing, you would need to integrate:

ΔS = ∫ δQ/T

For an ideal gas at constant volume and particle number, it turns out that

S = (3/2) N k_B ln T + const.

Thus doubling the temperature does lead to the exponential of S (the phase-space volume) growing by a fixed multiplicative factor, as you would expect.
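As a sanity check on the integral form, here is a small sketch (assuming a monatomic ideal gas in reduced units with N·k_B = 1; the temperatures are illustrative) comparing a numerical integration of δQ/T = C_V dT/T with the closed form:

```python
import math

# Monatomic ideal gas at constant volume, reduced units with N * k_B = 1,
# so C_V = 3/2. Temperatures are illustrative.
Cv = 1.5
T1, T2 = 300.0, 600.0

# Midpoint-rule integration of dS = Cv dT / T from T1 to T2.
steps = 100_000
dT = (T2 - T1) / steps
dS_numeric = sum(Cv * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

# Closed form from S = (3/2) ln T + const.
dS_exact = Cv * math.log(T2 / T1)

print(f"numeric  dS = {dS_numeric:.6f}")
print(f"analytic dS = {dS_exact:.6f}")
```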
answered Dec 15 '14 at 17:26
Answer (score 7)
You asked for intuitive sense and I'll try to provide it. The formula is:

ΔS = ΔQ/T

So you can have ΔS1 = ΔQ/T_lower and ΔS2 = ΔQ/T_higher. Assume the ΔQ is the same in each case. The denominator controls the "largeness" of the ΔS. Therefore, ΔS1 > ΔS2.
In each case, let's say you had X number of hydrogen atoms in each container. The only
difference was the temperature of the atoms. The lower temperature group is at a less frenzied
state than the higher temperature group. If you increase the "frenziedness" of each group by
the same amount, the less frenzied group will notice the difference more easily than the more
frenzied group.
Turning a calm crowd into a riot will be much more noticeable than turning a riot into a more frenzied riot. Try to think of the change in entropy as the noticeability of changes in riotous behavior.
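The comparison can be sketched numerically; the heat and the two temperatures below are made-up values:

```python
# Same heat dQ added to two containers held at different temperatures;
# the values are made up for illustration.
dQ = 50.0                          # joules
T_lower, T_higher = 250.0, 500.0   # kelvin

dS1 = dQ / T_lower    # entropy change of the calm crowd
dS2 = dQ / T_higher   # entropy change of the already-rioting crowd

print(f"dS1 = {dS1} J/K, dS2 = {dS2} J/K")
print("dS1 > dS2:", dS1 > dS2)
```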

edited Feb 8 '15 at 17:49
answered Feb 8 '15 at 17:34
Answer (score 3)
Heat added to a system at a lower temperature causes higher entropy increase than heat added
to the same system at a higher temperature.

How does this make intuitive sense?

The formula defines entropy change. Since it defines a new word, not conceived before and thus devoid of prior meaning, it is hard to imagine it having "intuitive sense".
If your question really is why people use this definition and not some other, here is one possible view.

Carnot came to the conclusion that all reversible cycles operating between two temperatures T1 < T2 (acquiring heats Q1 and Q2) have the same efficiency (work divided by heat consumed):

η = ∑W/Q2 = 1 − T1/T2

Today, this is derived from the 1st and 2nd laws of thermodynamics in most textbooks on thermodynamics. It is done most easily for an ideal gas, but the result is valid for any substance. It is at this point that the division by temperatures enters the discussion.

Since ∑W = Q1 + Q2, it follows that

Q1/T1 + Q2/T2 = 0

(the sum of reduced heats equals zero).

A general cyclic process has the same effect on the surroundings of the system as many Carnot cycles tessellating the original cycle in the work diagram, each operating with a very small amount of heat.

Writing Carnot's equation for all of them and summing the terms, adjacent terms cancel each other and we are left with a sum over only those terms that correspond to boundary elements of the curve representing the general cycle:

∑_s q_s/T_s = 0

with both isothermal and adiabatic elements s. We pass from this sum to a loop integral in the thermodynamic space of states X:

∮_Γ (J · dX)/T = 0

where J is a function of X and Γ such that the integral over a segment of Γ (call it ΔΓ), ∫_ΔΓ J · dX, is the heat accepted by the system when it changes state along the curve ΔΓ. The last equation can be expressed in words this way: the line integral of J/T along a closed curve Γ in the space of thermodynamic equilibrium states X is always zero.
It follows that the integral

∫ J · dX

(equal to the heat accepted by the system) depends on the path chosen to connect the thermodynamic equilibrium states Xi and Xf, but the integral

∫ (J · dX)/T

does not depend on it; it depends only on those two states. This enables us to define a function on the space of equilibrium states:

S(Xf) = S(Xi) + ∫ (J · dX)/T

where S(Xi) is the value of the function for a reference state Xi, chosen by convention. It does not matter which path is chosen to connect Xi and Xf; the value of S depends only on the endpoint Xf.

This function is called entropy. Unfortunately, there is nothing intuitive about it in thermodynamics; it's just a useful definition.
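Carnot's "sum of reduced heats" statement can be checked numerically for one concrete reversible cycle: a rectangle in the (T, V) plane for an ideal gas (reduced units with N·k_B = 1; the temperatures and volumes are made up). The net heat over the cycle is nonzero, but the sum of δQ/T over the legs vanishes:

```python
import math

# Rectangle cycle in the (T, V) plane for an ideal gas, reduced units
# N * k_B = 1, monatomic so C_V = 3/2. Temperatures and volumes are made up.
Cv = 1.5
T1, T2 = 300.0, 600.0
V1, V2 = 1.0, 2.0

# (heat absorbed, reduced heat = integral of dQ/T) for each reversible leg
legs = [
    (Cv * (T2 - T1),         Cv * math.log(T2 / T1)),  # isochoric heating at V1
    (T2 * math.log(V2 / V1), math.log(V2 / V1)),       # isothermal expansion at T2
    (Cv * (T1 - T2),         Cv * math.log(T1 / T2)),  # isochoric cooling at V2
    (T1 * math.log(V1 / V2), math.log(V1 / V2)),       # isothermal compression at T1
]

net_heat = sum(q for q, _ in legs)        # nonzero: this equals the net work
net_reduced = sum(r for _, r in legs)     # sums to zero for a reversible cycle

print(f"net heat over cycle:    {net_heat:.6f}")
print(f"sum of reduced heats:   {net_reduced:.2e}")
```

The vanishing sum is exactly what makes ∫ δQ/T path-independent, and hence lets S be defined as a state function.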

edited Feb 8 '15 at 14:10
answered Feb 8 '15 at 11:23
Ján Lalinský
Answer (score 2)
I find that the question here relates directly to the definition of temperature, and I'll give a
short version of it. For simplicity let us consider a system generated by two sub-systems, A and
B, in thermal contact (meaning they only exchange energy in the form of heat).

Let me state that, for A and B in thermal equilibrium, the entropies S_A and S_B of the respective sub-systems depend upon the energy in precisely the same way. This statement only reflects the thermal equilibrium condition. Further, consider this statement in more mathematical terms:

dS_A/dE = dS_B/dE

that is, when an infinitesimal energy quantity dE is traded between sub-systems A and B such that S_A changes, the corresponding change in S_B exactly compensates for it. The observation to be made is that the quantity dS/dE gives us a measure of how readily a system accepts a change in energy by thermal means. From this we define temperature as

1/T = dS/dE

and we can see that for temperatures T1 < T2, the change in entropy is greater when trading energy at T1 than at T2.
Edit: The intuition is in understanding the definition, although I will admit that it is still rather abstract, since it contains two abstract concepts, energy and entropy. This may be stated somewhat differently if we look more closely at the definition of entropy. For a given system, the entropy is defined as the logarithm of the number of accessible microstates g at some energy E:

S = ln g(E)

that is, g(E) counts the number of accessible microstates for a given energy E. This g may of course depend on several factors (volume, for instance). Considering the definition again, and not talking explicitly in terms of entropy: give a system some energy δE through thermal contact, and we realize that at a low system temperature the number of accessible quantum states g(E) increases to a greater extent than when the system is at a higher temperature. As such, the intuition is that we "unlock" more microstates with a given quantity of energy δE for a system at low temperature than at a higher temperature. This is relatively speaking: at low temperatures the system has low energy and thus few accessible microstates to be in, so an added energy δE increases the number of accessible states by a greater factor than at high temperatures. That is, the impact of δE on g(E) depends on how many new states it makes available compared to how many were already available.
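As a toy illustration of this "unlocking" picture (the power-law g(E) = E^N is an assumption chosen purely for simplicity, with k_B = 1):

```python
# Toy model: g(E) = E**N accessible microstates for N particles (an
# assumption chosen purely for illustration; k_B = 1 throughout).
N = 10
dE = 1.0   # energy delivered through thermal contact

def g(E):
    """Toy count of accessible microstates."""
    return E ** N

for E in (5.0, 50.0):             # low energy (cold) vs high energy (hot)
    growth = g(E + dE) / g(E)     # factor by which dE 'unlocks' new states
    inv_T = N / E                 # 1/T = dS/dE for S = N ln E
    print(f"E = {E:5.1f}: g grows by x{growth:6.3f}, 1/T = {inv_T:.2f}")
```

The same δE multiplies g by a much larger factor at low energy (large 1/T) than at high energy, which is the claim in words above.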
edited Feb 10 '15 at 9:22
answered Feb 8 '15 at 12:54
Answer (score 2)
The question is based on the reasoning that "S rises when Q is added, and T rises when Q is
added, so S must rise if T rises." The problem is that T is kinetic energy per particle while Q is
internal energy added to the entire system minus the work done. T is an intensive property
while S and Q are extensive properties. It's true that for every particle in a given system S ~
ln(T) + c and that S ~ ln(Q) + c (which I show below). I think the core problem is that the
question mixes system-wide Q with a per particle T.
To be precise in explaining how T and Q are different when considering an entire system: to
raise T without raising Q you would have to reduce the number of particles carrying the
energy. The energy from the removed particles would be added to the remaining ones. This
would result in lower entropy because although the number of states per particle is increased, the number of particles is decreased (the reduced N dominates in gases: S ~ N[a ln(T) + b − c ln(N)]). But to raise Q without raising T you would have to add particles, which raises entropy.

BTW, the original formula should have dQ instead of Q, so that you have dS = dQ/T. But in either case, it is obvious from the equation that at a higher T there will be a smaller increase dS for a given amount of heat added.

In the following I'll detail the relationship between Q and S, and then T and S, for a given system, showing that they have the same effect on S for a given fixed system, but only if you do not mix T and Q. "~" will mean proportional, not approximately equal. My statements should be exact, but a lot is hidden in the proportionalities because they take advantage of log(x^c) = c·log(x).
In looking at the Sackur-Tetrode equation for an ideal gas of N particles in a given box I see:
S ~ ln(states per particle) + constant
note: the "states per particle" does not change in a simple way if N changes.
states per particle ~ momentum p per particle
momentum per particle ~ SQRT(2m (K.E.) per particle)
K.E. per particle ~ Temperature, internal energy, and heat per particle (above absolute zero)
but approximating an unrealistic ideal that heat capacity is not changing with temperature.
ln(x^0.5) ~ ln(x)

So, for a given gas in a given box

S ~ ln(T) + constant
S ~ ln(Q) + constant

So here's what's happening at a T and a 2 x T:

S ~ ln(Q0 + Q1) + constant
S ~ ln(2Q0 + Q1) + constant
Example: Q0 = Q1 = 2
low T has fractional S increase with Q1 added of (1.38+c)/(0.69+c)
2x T has fractional S increase with Q1 added of (1.79+c)/(1.38+c)
I could replace the Q's with T's, so the original question has some merit, but you can't mix and match T and Q the way the question does.
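The arithmetic in the example above can be reproduced directly (taking the additive constant c = 0; any c gives the same ordering):

```python
import math

# Reproduce the Q0 = Q1 = 2 example with S ~ ln(Q) + c, taking c = 0.
Q0 = Q1 = 2.0

S_low_before, S_low_after = math.log(Q0), math.log(Q0 + Q1)            # low T
S_high_before, S_high_after = math.log(2 * Q0), math.log(2 * Q0 + Q1)  # 2x T

print(f"low  T: S goes {S_low_before:.3f} -> {S_low_after:.3f}")
print(f"high T: S goes {S_high_before:.3f} -> {S_high_after:.3f}")
print("increase at low T :", round(S_low_after - S_low_before, 3))
print("increase at high T:", round(S_high_after - S_high_before, 3))
```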

edited May 1 '17 at 23:01
answered Jul 25 '15 at 2:49
Answer (score 1)
In my statistical mechanics course last year, we derived the relation 1/T = dS/dU from the following argument.

Consider two boxes (each with N_i particles, V_i volume, U_i internal energy, and T_i temperature, where i = 1, 2) separated by a wall through which they can exchange energy (heat). Clearly Ω_i ∝ U_i and Ω_i ∝ 1/U_{3−i} when the total energy, U0 = U1 + U2, is fixed, where Ω_i is the number of states of system i.

The zeroth law of thermodynamics says that, at equilibrium, T1 = T2. The second law says that, if T1 < T2, then heat goes from system 2 to system 1, meaning that U1 < U1^eq, where U_i^eq is the internal energy of system i when the two systems are in equilibrium with each other.

In equilibrium, entropy (S = k_B ln Ω) is maximized.

Let's now expand the total entropy of the two systems, S_T:

S_T(U1) = S1(U1) + S2(U0 − U1)

From the above statement about equilibrium maximizing entropy,

dS_T/dU1 = 0 at U1 = U1^eq.

Consider that

U0 = U1 + U2 ⇒ 0 = dU1 + dU2 ⇒ dU1 = −dU2

At this point, it might seem reasonable to suppose that T = dS/dU (which it is not). As a check, let's expand about Ũ1 < U1^eq:

dS_T/dU1 = dS1/dU1 |_{Ũ1} − dS2/dU2 |_{U0−Ũ1} > 0

If T = dS/dU, this implies that T1 − T2 > 0, and that heat is flowing from cold to hot, in violation of the second law. To make our definition of temperature mesh with the second law, we therefore take

1/T = dS/dU ⇒ dU = T dS
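This argument can be illustrated numerically with the toy choice S_i = N_i ln U_i (so 1/T_i = N_i/U_i, with k_B = 1; the particle numbers and total energy are made up). The total entropy is maximized exactly where the two temperatures agree:

```python
import math

# Two boxes exchanging energy; toy entropies S_i = N_i ln U_i, so that
# 1/T_i = dS_i/dU_i = N_i / U_i (k_B = 1; all numbers made up).
N1, N2 = 5.0, 15.0
U0 = 100.0   # fixed total energy

def S_total(U1):
    """Total entropy as a function of how the energy is split."""
    return N1 * math.log(U1) + N2 * math.log(U0 - U1)

# Scan the split and locate the entropy maximum.
best_U1 = max((k / 1000 for k in range(1, 100_000)), key=S_total)

print(f"U1 at the entropy maximum: {best_U1:.3f}")
print(f"analytic prediction      : {U0 * N1 / (N1 + N2):.3f}")
print(f"1/T1 = {N1 / best_U1:.4f}, 1/T2 = {N2 / (U0 - best_U1):.4f}")
```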
edited Feb 8 '15 at 19:42
answered Feb 8 '15 at 19:24
Answer (score 1)
Entropy is defined in my book as ΔS=Q/T.
Already, this is not correct. In general,

dS ≥ δQ/T.

In the specific case where the system being heated is always in thermal equilibrium, then

dS = δQ/T.

So, clearly, δS = δQ/T cannot be taken as the definition of entropy.
The definition of entropy is [see Landau and Lifshitz, "Statistical Physics" (2nd edition), Eq. 7.7]:

S = ln ΔΓ

where ΔΓ is the statistical weight of the subsystem, which is defined as follows: if w_n = w(E_n) is the probability of finding the system in quantum state n, then w(Ē) ΔΓ ≡ 1, where Ē is the most probable energy of the system (which is the same as the macroscopic internal energy of thermodynamics, since fluctuations are negligible). To put it another way, ΔΓ is the number of states within ΔE of Ē (where ΔE is defined by: ΔE times the energy probability distribution at Ē equals 1).

We can rewrite the entropy more explicitly as a function of Ē as:

S(Ē) = ln ΔΓ(Ē)

(As an aside, it turns out that the log of w has to be linear in E, so that

S = −∑_n w_n ln w_n

which is another often-encountered definition of entropy.)
But anyway, since S is a function of Ē, we can take the derivative of S with respect to Ē. This derivative is defined to be the inverse temperature:

1/T = dS/dĒ

The derivative above is taken at fixed system volume, so this says that δS = δQ/T... I can explain that a little bit more below.
One way to change the energy of system is to perform work on the system. Typically in
thermodynamics this happens by compressing or expanding the volume of the system
(although it doesn't have to happen this way). If the system is not thermally isolated there is
another way to change the energy of the system, which is called heat. Heat is a direct transfer of
energy to/from other systems in thermal contact with the system. Heat and work. Work and heat. This is how we change the energy of the system. I.e.,

dE = δQ + δW

If the system of interest is always in thermal equilibrium throughout the process over which the energy is changed by heat and work, then from the definition of temperature we know that

δQ = T δS

holds.
So, anyway, we've arrived at ΔS = Q/T by defining T as

1/T = dS/dĒ

but why define the right-hand side as the inverse of the temperature, and not, say, as the inverse of the square of the temperature? One answer is that with any other definition you will not end up with the usual known thermodynamic laws, such as PV = NkT and whatnot. But does this definition jibe with my intuitive sense of temperature, I'm sure you are asking... Well, no, it does not. In order to make it jibe with your intuitive sense you need Boltzmann's constant k and the substitutions T → kT and S → S/k. So now does it jibe? Well, that depends on what your intuitive sense of temperature was to begin with. If you think about it, you will probably realize that we humans start off with an intuitive sense of hot and cold and the sense that heat flows from hot to cold (but not an intuitive sense of temperature). Later on we associate temperature with the expansion of mercury in thermometers and whatnot. And the above definition of temperature appropriately describes the expansion of mercury in thermometers, just as it appropriately describes the ideal gas law and so on... So, yes, this definition does jibe with that simple intuitive sense of temperature, only it is much sharper and more useful.
P.S. Most of the equations use the notation from Landau and Lifshitz. For more info along these
lines read chapters one and two of their "Statistical Physics".
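The aside about ln w being linear in E can be checked with a small toy spectrum and a Boltzmann distribution at T = 1 (the energy levels below are made up):

```python
import math

# Toy spectrum with Boltzmann weights at T = 1 (levels are made up).
E = [0.0, 1.0, 2.0, 3.0]
T = 1.0
Z = sum(math.exp(-e / T) for e in E)
w = [math.exp(-e / T) / Z for e in E]

# Gibbs entropy S = -sum w_n ln w_n
S = -sum(p * math.log(p) for p in w)

# ln w_n = -E_n/T - ln Z, so its slope in E_n should be -1/T on every interval.
slopes = [(math.log(w[i + 1]) - math.log(w[i])) / (E[i + 1] - E[i])
          for i in range(len(E) - 1)]

print(f"S = {S:.4f}")
print("d(ln w)/dE per interval:", [round(s, 6) for s in slopes])
```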

edited Feb 12 '15 at 5:54
answered Feb 12 '15 at 5:41
Answer (score 0)
A simple explanation: we know that heat travels from a body or system at a higher temperature to a colder body or system at a lower temperature. Also, the entropy change is directly proportional to the heat transferred. Therefore the entropy decreases in the former (higher-temperature) body and increases in the latter (lower-temperature) body, and since the total entropy must not decrease, the increase must outweigh the decrease. This is consistent with the notion that "heat added to a system at a lower temperature causes a higher entropy increase than heat added to the same system at a higher temperature."

answered Jan 14 at 16:41
Neil Paliwal