In a two-phase immersion cooling system, boiling on the spreader surface is experimentally observed to be nonuniform and strongly dependent on the surface temperature and the heat transfer coefficient. An experimentally obtained temperature-dependent boiling heat transfer coefficient is applied to a numerical model to investigate the spreader's cooling performance. The surface temperature distribution becomes less uniform at higher input power but more uniform as the spreader thickness increases. By defining characteristic temperatures that mark the different boiling regimes on the surface, the fraction of the surface area that reaches the critical heat flux is calculated numerically, showing that increasing the thickness from 1 mm to 6 mm decreases this area by 23% at the saturated liquid temperature. On the thicker spreader, therefore, more of the surface is utilized for nucleate boiling while localized hot regions that lead to surface dry-out are avoided. At a base temperature of 90 °C, the optimal thickness is found to be 4 mm, beyond which no significant further improvement in heat removal is obtained. Lowering the coolant temperature further increases heat removal, but the benefit shrinks with thickness: a 24 °C drop in coolant temperature improves the input power by 18% for the 1 mm spreader but by only 3% for the 6 mm spreader. A tradeoff therefore exists between the cost of maintaining a low liquid temperature and the increased heat removal capacity.
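The modeling idea summarized above — conduction in the spreader closed by a temperature-dependent boiling heat transfer coefficient, then tallying the surface fraction that exceeds a characteristic CHF temperature — can be sketched with a minimal 1-D fin model. Every parameter value, the `h_boil` correlation, and the 25 K CHF superheat threshold below are illustrative assumptions for the sketch, not the paper's measured data or its full numerical model.

```python
import numpy as np

# --- Illustrative parameters (assumed values, not from the study) ---
k = 400.0        # spreader thermal conductivity (copper), W/(m K)
T_sat = 34.0     # saturation temperature of the dielectric liquid, degC
L = 0.015        # half-width of the spreader, m (symmetry plane at x = 0)
w_heat = 0.005   # half-width of the heated base strip, m
q_base = 4e5     # base heat flux over the heater, W/m^2
N = 301          # number of grid nodes

def h_boil(dT):
    """Assumed nucleate-boiling coefficient rising with wall superheat dT."""
    return 2000.0 + 200.0 * np.maximum(dT, 0.0)   # W/(m^2 K)

def solve_spreader(t):
    """Steady 1-D fin model: k*t*T'' - h(T)*(T - T_sat) + q(x) = 0."""
    x = np.linspace(0.0, L, N)
    dx = x[1] - x[0]
    q = np.where(x < w_heat, q_base, 0.0)  # heat enters only over the heater
    T = np.full(N, T_sat + 10.0)           # initial guess
    c = k * t / dx**2
    for _ in range(300):                   # damped fixed-point on h(T)
        h = h_boil(T - T_sat)
        # Tridiagonal system: c*(T[i-1] - 2 T[i] + T[i+1]) - h[i]*T[i]
        #                     = -(q[i] + h[i]*T_sat)
        A = np.zeros((N, N))
        b = -(q + h * T_sat)
        for i in range(N):
            A[i, i] = -2.0 * c - h[i]
            if i == 0:            # symmetry: mirror node T[-1] = T[1]
                A[i, i + 1] = 2.0 * c
            elif i == N - 1:      # insulated outer edge
                A[i, i - 1] = 2.0 * c
            else:
                A[i, i - 1] = c
                A[i, i + 1] = c
        T = 0.5 * T + 0.5 * np.linalg.solve(A, b)
    return x, T

def chf_area_fraction(T, dT_chf=25.0):
    """Fraction of the surface whose superheat exceeds an assumed CHF value."""
    return float(np.mean(T - T_sat > dT_chf))
```

Under these assumptions the thicker spreader spreads heat laterally over a longer fin length, lowering the peak temperature and shrinking the area that exceeds the CHF threshold, which is the qualitative trend the abstract reports.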