Most cooling systems are limited by the requirement that the cooling fluid not be allowed to boil, as the need to handle gas in the flow greatly complicates design. For a water-cooled system, this means that the maximum amount of heat transfer is limited by the specific heat capacity of water and the difference between the ambient temperature and 100°C. Such systems cool more effectively in the winter, or at higher altitudes where temperatures are low.
Another effect that is especially important in aircraft cooling is that the boiling point changes with pressure, and atmospheric pressure drops more rapidly with altitude than the temperature does. Thus, generally, liquid cooling systems lose capacity as the aircraft climbs. This was a major limit on performance during the 1930s, when the introduction of turbosuperchargers first allowed convenient flight at altitudes above 15,000 ft, and cooling design became a major area of research.
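The size of this effect can be sketched numerically. The short Python script below combines the ICAO standard-atmosphere pressure model with the Antoine equation for water (constants valid roughly 1–100°C) to estimate the boiling point of an unpressurized coolant at altitude; the function names and the sample altitudes are illustrative, not from the source.

```python
import math

def isa_pressure_mmhg(alt_m):
    """ICAO standard-atmosphere pressure in the troposphere, in mmHg."""
    p_pa = 101325.0 * (1.0 - 2.25577e-5 * alt_m) ** 5.25588
    return p_pa / 133.322  # Pa -> mmHg

def boiling_point_c(p_mmhg):
    """Antoine equation for water (A, B, C valid roughly 1-100 C)."""
    A, B, C = 8.07131, 1730.63, 233.426
    return B / (A - math.log10(p_mmhg)) - C

for alt_ft in (0, 15000, 25000):
    alt_m = alt_ft * 0.3048
    t_boil = boiling_point_c(isa_pressure_mmhg(alt_m))
    print(f"{alt_ft:>6} ft: water boils at about {t_boil:.1f} C")
```

At sea level this reproduces the familiar 100°C, while at 15,000 ft the boiling point falls to roughly 85°C, eroding the usable coolant-to-air temperature difference just as engine power demands were rising.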
The most obvious, and common, solution to this problem was to pressurize the entire cooling system. This held the boiling point at a constant value while the outside air temperature continued to drop, so such systems actually improved in cooling capability as they climbed. For most uses this solved the problem of cooling high-performance piston engines, and almost all liquid-cooled aircraft engines of the World War II period used this solution.
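The improvement with altitude follows directly from the temperature margin between the coolant and the outside air. A minimal sketch, assuming a sealed system pressurized enough to pin the boiling point near an illustrative 120°C (a plausible but assumed figure) and using the standard-atmosphere lapse rate for ambient temperature:

```python
def isa_temp_c(alt_m):
    """ISA ambient temperature in the troposphere: 15 C at sea level, 6.5 C/km lapse."""
    return 15.0 - 0.0065 * alt_m

# Assumption for illustration: the pressure cap holds the coolant's
# boiling point fixed at about 120 C regardless of altitude.
T_BOIL_PRESSURIZED = 120.0

for alt_ft in (0, 15000, 25000):
    alt_m = alt_ft * 0.3048
    margin = T_BOIL_PRESSURIZED - isa_temp_c(alt_m)
    print(f"{alt_ft:>6} ft: coolant-to-air temperature margin {margin:.0f} C")
```

Because the coolant temperature limit no longer falls with altitude while the ambient air keeps getting colder, the margin driving heat rejection grows on the climb, which is why pressurized systems cooled better at height rather than worse.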
However, pressurized systems were also more complex and far more susceptible to damage: because the cooling fluid was under pressure, even minor damage to the cooling system, such as a single rifle-calibre bullet hole, would cause the liquid to spray rapidly out of the hole. Failures of the cooling system were, by far, the leading cause of engine failures.