Abstract
Corrosion of iron-chromium-nickel castings by sulphur-bearing gases at 1800–2000 F has been investigated for the Alloy Casting Institute in a comprehensive series of tests. The sulphur was varied from 0 to 500 grains per 100 cu ft in both reducing and oxidizing flue gases. The effects of cyclic temperature fluctuations and of alternately oxidizing and reducing atmospheres were also studied. Both metal loss and subsurface attack were considered in evaluating the alloys. A wide range of compositions in the ternary system Fe–Cr–Ni was studied, with chromium varying from 11 to 36 per cent and nickel from 0 to 70 per cent. Although the behavior of the commercially important alloys deviated in specific instances, it was shown that, in general, the HE (28 per cent Cr, 9 per cent Ni), HK (26 per cent Cr, 20 per cent Ni), and HH (26 per cent Cr, 12 per cent Ni) types had the best resistance under a variety of test conditions. The HW (12 per cent Cr, 60 per cent Ni) alloy was generally better than the HT (15 per cent Cr, 35 per cent Ni) alloy, and both performed well until the sulphur content of the gas was raised to 100 grains per 100 cu ft. The HX (17 per cent Cr, 66 per cent Ni) and HU (19 per cent Cr, 39 per cent Ni) alloys were generally better than the HW and HT types; however, they too were inferior to the HE, HK, and HH types when sulphur contents were 100 grains per 100 cu ft or higher. The HF (20 per cent Cr, 10 per cent Ni) alloy, originally designed for temperatures below 1600 F, performed reasonably well in the tests; at 1800 F in a reducing atmosphere containing at least 100 grains of sulphur per 100 cu ft, it was slightly superior to the HW and HT types. In general, corrosion in the atmospheres of higher sulphur content was much less severe when the flue gas was oxidizing than when it was reducing.
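As a point of reference (a metric conversion added here for convenience, not given in the original, using the standard factors 1 grain ≈ 0.0648 g and 100 cu ft ≈ 2.832 cu m), the 100 grains per 100 cu ft threshold corresponds to roughly

$$
\frac{100 \times 0.0648\ \text{g}}{2.832\ \text{m}^3} \approx 2.3\ \text{g of sulphur per m}^3
$$

of flue gas, and the 500-grain maximum to about 11.4 g per cu m.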