Maybe I’m totally misunderstanding the terminology, but I had read that Li-ion batteries (in Dell laptops, for instance, which I have) should not exceed 104 F, lest they run the risk of fire or venting gas. Yet I had also read that laptop processors/CPUs regularly reach 175 F or hotter without issues.
Granted, the CPU is located some distance away from the Li-ion battery, but at that level of heat, I would expect the heat to find some way to warm the battery indirectly to the point where it exceeds 104 F. Does the 104 F limit apply when the laptop is turned off, or when it is turned on?
That 175 F is the die temperature, i.e. the temperature of the actual silicon the CPU is built out of. Most modern CPUs have some self-monitoring capability built into their design, and this is what is reporting the temperature. The surface of the package containing the IC will be cooler, and the cooling infrastructure cooler still.
In the end, what matters is the power input to the whole device and its ability to reject that heat. Laptops tend to include reasonably sophisticated thermal management, especially for the package size. The processor, the GPU if there is one, and the ancillary integrated circuits are cooled with heat pipes and fan assist when needed.
This tries to keep the inside cool. If the CPU die temperature gets much hotter than 175 F, the CPU will tend to throttle itself and keep itself from dying. The rest of the cooling system keeps the laptop temperature down to something sensible.
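One way to see why the die can sit near 175 F while everything downstream stays cooler is a steady-state thermal-resistance chain: each stage between the die and the ambient air adds a temperature rise of power times resistance. Here is a minimal sketch of that idea; all the wattage and resistance numbers are assumed, order-of-magnitude values for illustration, not specs for any real laptop.

```python
# Illustrative steady-state thermal model: heat flows from the die
# through a chain of thermal resistances to ambient air.
# Every number below is an assumed ballpark figure.

P = 25.0            # CPU power dissipation in watts (assumed)
T_ambient_C = 25.0  # room temperature in deg C

# Thermal resistances in deg C per watt (assumed values):
R_die_to_case = 0.5   # silicon die to package surface
R_case_to_sink = 0.6  # package to heat pipe / heatsink
R_sink_to_air = 1.0   # heatsink to ambient, fan running

def to_f(c):
    """Convert Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

# Each resistance in the chain adds a rise of P * R,
# so temperature climbs as you move back toward the die.
T_sink = T_ambient_C + P * R_sink_to_air
T_case = T_sink + P * R_case_to_sink
T_die = T_case + P * R_die_to_case

print(f"heatsink: {to_f(T_sink):.0f} F")  # coolest stage
print(f"package:  {to_f(T_case):.0f} F")
print(f"die:      {to_f(T_die):.0f} F")   # hottest stage, ~170 F here
```

With these made-up numbers the die lands near the 175 F figure while the heatsink surface sits decades of degrees cooler, which is the same gradient that keeps the chassis (and the battery) much cooler than the silicon.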
Put the laptop in a vacuum with thermal management disabled and, given enough time, it would get close to thermal equilibrium with the CPU die. It would not get all the way there as it would also be radiating heat away. But it wouldn’t be happy.
I missed the edit window, but it occurred to me to add that the power density on the surface of CPUs and GPUs is nothing short of insane. The actual layer of silicon that does the work is only microns thick, and a few millimeters to the teens of millimeters on each edge; if it were a cube you would have trouble seeing it. Yet we slam tens of watts through it for low-power computers like laptops, through to low hundreds of watts for high-end devices. It is a major achievement stopping these devices from self-immolation in the first few milliseconds of use.
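To put a rough number on "insane", here is the back-of-envelope arithmetic, using an assumed 10 mm square die at an assumed 30 W sustained; neither figure comes from a specific part.

```python
# Rough power-density arithmetic for a laptop CPU die.
# Die size and power are assumed, ballpark figures.

die_edge_mm = 10.0   # ~10 mm on a side (assumed)
power_w = 30.0       # sustained laptop-class power (assumed)

area_mm2 = die_edge_mm ** 2
density_w_per_mm2 = power_w / area_mm2

# Scale up to a square meter for an intuitive comparison;
# full sunlight at the surface is roughly 1 kW/m^2.
density_kw_per_m2 = density_w_per_mm2 * 1_000_000 / 1000

print(f"{density_w_per_mm2:.2f} W/mm^2")   # prints 0.30 W/mm^2
print(f"{density_kw_per_m2:.0f} kW/m^2")   # prints 300 kW/m^2
```

That is on the order of a few hundred times the intensity of direct sunlight, concentrated on a surface you could hide under a coin, which is why the heat has to be spread out immediately.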
All the facts have been said, but to draw analogies: An electric stove eye gets to 1000 deg F, yet the bulk of the stove is harmless to touch. A candle flame can be 2000 deg F, yet you can pick the candle up without worry.
Any heat generated in a device has to go somewhere, and part of that “somewhere” is out to the environment; that’s why the whole stove, candle, or laptop doesn’t get as hot as its hottest piece. The heat from chemical reactions in a candle, electrical heating of a stove element, or electrical heating of a CPU enters at one “point” but leaves in all sorts of ways. In the CPU’s case, the goal is to make sure the generated heat flows most efficiently to wherever it can be lost to the environment most efficiently. The analogies are meant to (hopefully) make it not such a crazy thing to think that the battery isn’t going to get cooked by the hot CPU. (The CPU might cook itself, though, and that’s the actual hard part of the thermal-management problem in a laptop, where you can’t clamp a big heatsink to the chip like you can in a desktop PC.)