I happened across a thermodynamics problem which tasks the reader with heating up a volume of 20°C water using the same volume of 80°C water, such that the final temperature of the once-cold water exceeds that of the once-hot water. The only implements available are three insulated bottles and an empty container that fits into any of the insulated bottles and has thermally conducting walls. You're not allowed to mix the hot and cold water.
Visually, the setup can be illustrated like this:
If you'd like to work it out for yourself, stop here.
The solution is to heat half the cold water at a time. First put half the cold water in the container that can be inserted into an insulated bottle:
Then insert it into the hot water bottle:
Because the container's walls conduct heat, the two temperatures will equalize. Since there's twice as much 80°C water as 20°C water, the cold water's temperature rises twice as much as the hot water's falls. Stated mathematically, 80°C - ΔT = 20°C + 2ΔT, meaning ΔT is 20°C and both bodies of water settle at 60°C.
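Since both bodies are water, the equilibrium temperature is just the mass-weighted average of the starting temperatures. A quick sketch (the `equilibrium` helper is my own name for it, not part of the puzzle):

```python
def equilibrium(t1, m1, t2, m2):
    # Equilibrium temperature of two bodies with the same specific heat:
    # the mass-weighted average of their temperatures.
    return (t1 * m1 + t2 * m2) / (m1 + m2)

# One unit of 80°C water, half a unit of 20°C water:
print(equilibrium(80, 1.0, 20, 0.5))  # 60.0
```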
Store the warmed water in the empty bottle:
Repeat the immersion with the still-20°C water:
Now there's twice as much 60°C water as 20°C water, so 60°C - ΔT = 20°C + 2ΔT, meaning ΔT is 13⅓°C and both settle at 46⅔°C.
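The same weighted-average calculation covers this step; using exact fractions avoids the repeating decimal (again, `equilibrium` is just a hypothetical helper name):

```python
from fractions import Fraction as F

def equilibrium(t1, m1, t2, m2):
    # Mass-weighted average temperature of two bodies of water.
    return (t1 * m1 + t2 * m2) / (m1 + m2)

# One unit of 60°C water warming the remaining half unit of 20°C water:
print(equilibrium(F(60), F(1), F(20), F(1, 2)))  # 140/3, i.e. 46⅔°C
```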
Mixing equal amounts of the 60°C and 46⅔°C halves of the once-cold water averages them to 53⅓°C:
And the once-20°C water is now warmer than the once-80°C water.
This works (in an ideal system), but it feels paradoxical. Using nothing but equalizing heat transfer, we've raised the cold water by 33⅓°C when the containers only differed by 60°C, so we've moved more than half the difference (a single equilibration would give at most 30°C) just by dividing the transfer into two steps. Note, though, that energy is conserved: since the masses are equal, the sum of the temperatures is unchanged, 80°C + 20°C = 100°C = 46⅔°C + 53⅓°C.
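As a quick sanity check, the whole procedure fits in a few lines (the `equilibrium` helper is my own naming, not part of the puzzle):

```python
from fractions import Fraction as F

def equilibrium(t1, m1, t2, m2):
    # Mass-weighted average temperature of two bodies of water.
    return (t1 * m1 + t2 * m2) / (m1 + m2)

hot, cold = F(80), F(20)
# Step 1: half the cold water equalizes with all the hot water.
step1 = equilibrium(hot, F(1), cold, F(1, 2))    # both reach 60°C
# Step 2: the other half equalizes with the now-60°C water.
step2 = equilibrium(step1, F(1), cold, F(1, 2))  # both reach 46⅔°C
# Mix the two warmed halves of the once-cold water.
once_cold = (step1 + step2) / 2                  # 53⅓°C
once_hot = step2                                 # 46⅔°C
assert once_cold > once_hot
assert hot + cold == once_cold + once_hot        # sum of temperatures conserved
print(once_cold, once_hot)  # 160/3 140/3
```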
In the next post I'll model these steps with code so we can try a few more scenarios.