Sizzler, when you state that smoldering fires must be cooler, you are repeating a common misunderstanding, one that ultimately comes from failing to distinguish between heat and temperature.
Combustion releases chemical energy in the form of heat. When heat is added to matter, its temperature goes up. More heat means higher temperature, without limit. If you have hydrogen and oxygen separately preheated to, say, 2000 degrees, and you mix them and allow them to combust, the reaction adds heat and the reaction products will get hotter than 2000 degrees.
The reason fires don't normally keep heating up to thousands of degrees has nothing to do with any inherent limits on energy conversion by combustion. (Such limits do exist, but at much higher temperatures, where chemical bonds cannot form at all.) It's because of heat dissipation.
Consider what happens if you have a steady release of heat in a localized environment -- say, a gas burner flame under an empty pan. The general rule is that the hotter something is compared to its surroundings, the faster heat flows from it to those surroundings. So at first the temperature of the pan rises, because while it is only slightly above the temperature of the surroundings, heat transfers to the surroundings much more slowly than the burner releases it. As the pan's temperature rises, it exceeds that of its surroundings by a larger amount, so it loses heat faster. The temperature keeps rising as long as the rate heat is lost to the surroundings is less than the rate heat is released by the flame, until it reaches the point where the two rates are equal: an equilibrium temperature. Whether the pan gets hot enough to melt depends on the temperature at which that equilibrium occurs, which in turn depends on both the rate of heat release and the rate of heat loss to the surroundings. So how hot the pan gets doesn't depend only on how high the gas flame is turned up. It also depends on the size, shape, and materials of the pan, and on the configuration and materials of the immediate surroundings.
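If it helps, the pan scenario can be sketched numerically. This is a toy model with invented numbers, not a real thermal simulation: heat input is held constant, and heat loss is taken as simply proportional to the temperature difference.

```python
# Toy model of the heated pan: constant heat input versus heat loss
# proportional to the temperature difference. All numbers are made up.

def equilibrium_temp(q_in, k_loss, t_ambient):
    """Temperature at which k_loss * (T - t_ambient) equals q_in."""
    return t_ambient + q_in / k_loss

def simulate(q_in, k_loss, t_ambient, heat_capacity, dt=1.0, steps=5000):
    """Step the pan temperature forward; it climbs toward equilibrium."""
    t = t_ambient
    for _ in range(steps):
        net = q_in - k_loss * (t - t_ambient)   # power in minus power out
        t += net * dt / heat_capacity           # dT = dQ / C
    return t

t_eq = equilibrium_temp(q_in=1000.0, k_loss=2.0, t_ambient=20.0)
t_sim = simulate(q_in=1000.0, k_loss=2.0, t_ambient=20.0, heat_capacity=500.0)
print(t_eq, t_sim)  # the simulated temperature levels off at t_eq
```

Turn up the flame (raise `q_in`) or insulate the pan (lower `k_loss`), and the temperature doesn't rise forever; it settles at a new, higher equilibrium.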
Similarly for an uncontrolled fire: if you change either the combustion rate or any of the many factors affecting the heat dissipation rate, you change the "maximum temperature" (which you can now see is not a maximum at all, but an equilibrium temperature) to which a fire can heat the surrounding materials.
Right away, this means that any blanket statement of the "maximum temperature" a given kind of fuel can reach is bunk. The best that can be said is that certain typical fire situations reach certain typical maximum temperatures. Change any parameter -- fuel, ventilation, surrounding materials, shape, or size of the scenario -- and the equilibrium temperature can change radically. If any parameter is not typical, do not expect typical temperatures.
How do specific factors change the rate of heat loss? Let's look in more detail at the physical mechanisms by which heat moves: conduction (heat transfer by contact between molecules), convection (heat transfer by movement of heated fluid), and radiation (heat transfer by emission and absorption of photons). Heat transfer is complex because these three mechanisms follow different rules in how they are affected by materials, geometry, and the magnitude of the temperature difference. For example, heat cannot pass across a vacuum by conduction or convection, but it can by radiation. Opaque solid materials can block radiation, but many of them permit rapid conduction.
Note that all of these mechanisms can only transfer heat from the surface of a heated mass. (That's a bit of an oversimplification for radiation; it doesn't hold for transparent materials, but transparent materials are also poor radiators, so it's still basically true.) As you scale up a heated mass of a given shape in three dimensions, heat generation grows with the volume while heat loss grows with the surface area, so the ratio of heat generated to heat lost to the surroundings must increase. That's one reason bigger fires can get hotter.
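This is just the cube-square law. A quick sketch, with a cube standing in for the heated mass:

```python
# Heat generation scales with volume, heat loss with surface area,
# so the volume-to-surface ratio grows linearly with size.
ratios = []
for side in [1, 10, 100]:
    volume = side ** 3        # proportional to heat generated
    surface = 6 * side ** 2   # proportional to heat loss rate
    ratios.append(volume / surface)
print(ratios)  # each tenfold increase in size multiplies the ratio tenfold
```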
Conduction of heat through a solid depends linearly on the temperature gradient: twice as much temperature difference means twice as much heat flows. Heat transfer by radiation depends on the difference of the fourth powers of the temperatures (in absolute units). That means radiation is a minor factor at low temperature differences, but it increases much faster, so that at high temperatures it becomes the dominant factor. Convection is in between those two: it increases faster than linearly with temperature, but not as fast as radiation.
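You can see the crossover directly from the two scaling laws. In this sketch the conduction coefficient is chosen arbitrarily so that the two mechanisms are equal at 400 K; only the shapes of the curves matter.

```python
# Conductive transfer is linear in the temperature difference; radiative
# transfer follows the Stefan-Boltzmann fourth-power law.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
T_AMB = 300.0     # ambient temperature, K

def radiation(t):
    return SIGMA * (t ** 4 - T_AMB ** 4)

k = radiation(400.0) / (400.0 - T_AMB)   # force equality at 400 K

def conduction(t):
    return k * (t - T_AMB)

ratios = [radiation(t) / conduction(t) for t in (400.0, 800.0, 1600.0)]
print(ratios)  # radiation pulls further and further ahead as T climbs
```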
This means that the hotter a fire is, the more radiation and convection become the dominant processes of heat dissipation. Many fires grow in two dimensions, horizontally, rather than three. In such cases the ground below the fire and the air above it are poor conductors of heat. So conduction only acts significantly at the edges of the fire (whose length grows only as the square root of the fire's area, and so shrinks in proportion to the fire's size), while convection and radiation apply throughout the area. More generally, heat loss from conduction is only important in fires on a very small scale -- it's the main reason it's difficult to ignite a large wood log with a match, but it won't make any difference if you throw the same log onto a bonfire.
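The edge-versus-area point, sketched for an idealized circular fire:

```python
from math import pi, sqrt

# For a disc-shaped fire, the perimeter (where conduction acts) grows as
# the square root of the area, so edge per unit area falls as it spreads.
edge_per_area = []
for area in [1.0, 100.0, 10000.0]:
    radius = sqrt(area / pi)
    edge_per_area.append(2 * pi * radius / area)
print(edge_per_area)  # each hundredfold growth in area cuts the ratio tenfold
```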
Now, what happens when there is a large fire burning underground, in a coal seam or the huge rubble piles at Ground Zero? You looked at the rate of heat release, which is limited by the limited ventilation, and concluded that such fires can only reach lower temperatures. But to find an equilibrium you must consider both sides of the equation: in this case, the heat release rate and the heat dissipation rate. Look more closely at what happens to the heat in an underground fire. Heat transfer by radiation is, for all practical purposes, eliminated. Heat transfer by convection is greatly reduced; only the relatively small smoke plumes are removing heat from the mass. That leaves conduction, which is a far slower mechanism of heat transfer to begin with, especially for large-volume fires, and the piles contain layers of crushed concrete and are contained in concrete and earth, which are excellent insulators. The mechanisms that most effectively remove the most heat from the largest masses at the highest temperatures are exactly the ones that are shut down. Heat dissipation is slowed by limited convection and radiation to a larger degree than heat release is slowed by limited ventilation. So the equilibrium interior temperature, at which the heat dissipation rate equals the heat release rate, increases.
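In terms of the pan model from before, where the equilibrium temperature is ambient plus heat release rate divided by loss coefficient: suppose, purely for illustration, that burying the fire cuts heat release (ventilation) to a tenth, but cuts heat dissipation (no radiation, little convection, insulating debris) to a hundredth. The numbers here are invented; only the direction of the outcome matters.

```python
# Equilibrium: T_eq = T_ambient + (heat release rate) / (loss coefficient).
t_amb = 20.0
q_open, k_open = 1.0e6, 1.0e3                      # made-up open-fire values
t_open = t_amb + q_open / k_open                   # open fire equilibrium
t_buried = t_amb + (q_open / 10) / (k_open / 100)  # buried fire equilibrium
print(t_open, t_buried)  # the buried, smoldering fire equilibrates hotter
```

Whenever dissipation drops by a larger factor than release, the equilibrium temperature goes up, not down, despite the slower burning.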
Please, look up underground coal seam fires if you still have difficulty understanding how a smoldering fire underground can become much hotter than an open fire burning the same fuels. That should at least suggest to you that your incredulity results from your not understanding the complex nature of the phenomenon, rather than from the phenomenon not existing.
You might also enjoy considering the following question, using the above facts to guide you: why is the sun so hot? You might want to start by calculating the sun's mean heat release rate per cubic centimeter, given that the sun has a limited supply of fuel that is expected to last over 10 billion years. Cubic centimeter for cubic centimeter, the sun produces heat orders of magnitude slower than any chemical fire, even orders of magnitude slower than the tissues of your own body. So how does the sun stay so hot for billions of years?
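Here's a head start on that back-of-envelope calculation, using round approximate figures for the solar luminosity and radius, and rough guesses for a resting human body:

```python
from math import pi

# The sun's heat release per unit volume versus a resting human body.
L_SUN = 3.8e26   # W, approximate total solar output
R_SUN = 7.0e8    # m, approximate solar radius
sun_w_per_m3 = L_SUN / ((4.0 / 3.0) * pi * R_SUN ** 3)

body_w_per_m3 = 100.0 / 0.07   # roughly 100 W of metabolism in ~0.07 m^3

print(sun_w_per_m3, body_w_per_m3)  # a fraction of a watt vs. over a thousand
```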
Respectfully,
Myriad