The History of Steel

From Iron Age to Electric Arc Furnaces

[Image: Steel factory workers in hardhats standing near a large steel tube hanging from an overhead crane.]

The development of steel can be traced back 4000 years to the beginning of the Iron Age. Proving to be harder and stronger than bronze, which had previously been the most widely used metal, iron began to displace bronze in weaponry and tools.

For the following few thousand years, however, the quality of iron produced would depend as much on the ore available as on the production methods.

By the 17th century, iron's properties were well understood, but increasing urbanization in Europe demanded a more versatile structural metal. And by the 19th century, the amount of iron being consumed by expanding railroads provided metallurgists with the financial incentive to find a solution to iron's brittleness and inefficient production processes.

Undoubtedly, though, the most significant breakthrough in steel history came in 1856, when Henry Bessemer developed an effective way to use oxygen to reduce the carbon content in iron: the modern steel industry was born.

The Era of Iron

At very high temperatures, iron begins to absorb carbon, which lowers the melting point of the metal, resulting in cast iron (2.5 to 4.5% carbon). The development of blast furnaces, first used by the Chinese in the 6th century BC but more widely used in Europe during the Middle Ages, increased the production of cast iron.

Pig iron is molten iron run out of the blast furnace and cooled in a main channel and adjoining molds; the large central ingot and the smaller ingots attached to it resembled a sow with suckling piglets, which is how the metal got its name.

Cast iron is strong but suffers from brittleness due to its carbon content, making it less than ideal for working and shaping. As metallurgists became aware that the high carbon content in iron was central to the problem of brittleness, they experimented with new methods for reducing the carbon content to make iron more workable.

By the late 18th century, ironmakers learned how to transform cast pig iron into low-carbon wrought iron using puddling furnaces (developed by Henry Cort in 1784). The furnaces heated molten iron, which had to be stirred by puddlers using long, oar-shaped tools, allowing oxygen to combine with and slowly remove the carbon.

As the carbon content decreases, iron's melting point increases, so masses of iron would agglomerate in the furnace. These masses would be removed and worked with a forge hammer by the puddler before being rolled into sheets or rails. By 1860, there were over 3000 puddling furnaces in Britain, but the process remained hindered by its labor and fuel intensiveness.

One of the earliest forms of steel, blister steel, began production in Germany and England in the 17th century and was made by increasing the carbon content of wrought iron using a process known as cementation. In this process, bars of wrought iron were layered with powdered charcoal in stone boxes and heated.

After about a week, the iron would absorb the carbon in the charcoal. Repeated heating would distribute carbon more evenly and the result, after cooling, was blister steel. The higher carbon content made blister steel much more workable than pig iron, allowing it to be pressed or rolled.

Blister steel production advanced in the 1740s when English clockmaker Benjamin Huntsman, while trying to develop high-quality steel for his clock springs, found that the metal could be melted in clay crucibles and refined with a special flux to remove the slag that the cementation process left behind. The result was crucible, or cast, steel. But due to the cost of production, both blister and cast steel were only ever used in specialty applications.

As a result, wrought iron made in puddling furnaces remained the primary structural metal in industrializing Britain during most of the 19th century.

The Bessemer Process and Modern Steelmaking

The growth of railroads during the 19th century in both Europe and America put enormous pressure on the iron industry, which still struggled with inefficient production processes. Steel was still unproven as a structural metal, and its production was slow and costly. That changed in 1856, when Henry Bessemer came up with a more effective way to introduce oxygen into molten iron to reduce the carbon content.

Bessemer designed a pear-shaped receptacle, referred to as a converter, in which iron could be heated while oxygen was blown through the molten metal; the technique became known as the Bessemer process. As the oxygen passed through the molten metal, it reacted with the carbon, releasing carbon dioxide and producing a purer iron.
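
In simplified terms, the converter chemistry amounts to the air blast oxidizing the dissolved carbon and silicon (a rough sketch; in practice some of the carbon leaves as carbon monoxide rather than carbon dioxide):

\[ \mathrm{C} + \mathrm{O_2} \rightarrow \mathrm{CO_2} \qquad\qquad \mathrm{Si} + \mathrm{O_2} \rightarrow \mathrm{SiO_2} \]

The carbon escapes as gas, the silica joins the slag, and both reactions give off heat, which is what kept the charge molten without external fuel.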

The process was fast and inexpensive, removing carbon and silicon from iron in a matter of minutes, but it suffered from being too successful: too much carbon was removed, and too much oxygen remained in the final product. Bessemer ultimately had to repay his investors until he could find a method to increase the carbon content and remove the unwanted oxygen.

At about the same time, British metallurgist Robert Mushet acquired and began testing spiegeleisen, an alloy of iron, carbon, and manganese. Manganese was known to remove oxygen from molten iron, and the carbon in the spiegeleisen, if added in the right quantities, would restore what the converter had removed, offering a solution to Bessemer's problems. Bessemer began adding it to his conversion process with great success.
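
As a rough sketch of why the spiegeleisen worked, its manganese scavenges the dissolved oxygen (largely present as iron oxide in the melt) while its own carbon tops the metal back up:

\[ \mathrm{Mn} + \mathrm{FeO} \rightarrow \mathrm{MnO} + \mathrm{Fe} \]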

One problem remained. Bessemer had failed to find a way to remove phosphorus, a deleterious impurity that makes steel brittle, from his end product. Consequently, only phosphorus-free ore from Sweden and Wales could be used.

In 1876 Welshman Sidney Gilchrist Thomas came up with the solution by adding a chemically basic flux, limestone, to the Bessemer process. The limestone drew phosphorus from the pig iron into the slag, allowing the unwanted element to be removed.
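
In simplified terms, the phosphorus is oxidized in the converter and then locked into the slag by the lime, roughly:

\[ 2\,\mathrm{P} + 5\,\mathrm{FeO} \rightarrow \mathrm{P_2O_5} + 5\,\mathrm{Fe} \qquad\qquad \mathrm{P_2O_5} + 3\,\mathrm{CaO} \rightarrow \mathrm{Ca_3(PO_4)_2} \]

The phosphate-rich slag could then be removed, carrying the unwanted element with it.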

This innovation meant that, finally, iron ore from anywhere in the world could be used to make steel. Not surprisingly, steel production costs began decreasing significantly. Prices for steel rail dropped more than 80% between 1867 and 1884 as a result of the new steel-producing techniques, initiating the growth of the world steel industry.

The Open Hearth Process

In the 1860s, German engineer Karl Wilhelm Siemens further enhanced steel production through his creation of the open-hearth process. The open-hearth process produced steel from pig iron in large shallow furnaces.

The process, using high temperatures to burn off excess carbon and other impurities, relied on heated brick chambers below the hearth. Regenerative furnaces later used exhaust gases from the furnace to maintain high temperatures in the brick chambers below.

This method allowed for the production of much larger quantities (50-100 metric tons could be produced in one furnace), periodic testing of the molten steel so that it could be made to meet particular specifications, and the use of scrap steel as a raw material. Although the process itself was much slower, by 1900 the open-hearth process had largely replaced the Bessemer process.

Birth of the Steel Industry

The revolution in steel production that provided cheaper, higher-quality material was recognized by many businessmen of the day as an investment opportunity. Capitalists of the late 19th century, including Andrew Carnegie and Charles Schwab, invested and made millions (billions in the case of Carnegie) in the steel industry. US Steel Corporation, which absorbed Carnegie's steel company when it was founded in 1901, was the first corporation in the world valued at over one billion dollars.

Electric Arc Furnace Steelmaking

Just after the turn of the century, another development occurred that would have a strong influence on the evolution of steel production. Paul Heroult's electric arc furnace (EAF) was designed to pass an electric current through the charged material, producing exothermic oxidation and temperatures up to 3272°F (1800°C), more than sufficient for steelmaking.

Initially used for specialty steels, EAFs grew in use and, by World War II, were being used for the manufacturing of steel alloys. The low investment cost involved in setting up EAF mills allowed them to compete with the major US producers like US Steel Corp. and Bethlehem Steel, especially in carbon steels, or long products.

Because EAFs can produce steel from 100% scrap, or cold ferrous feed, less energy per unit of production is needed. Unlike basic oxygen furnaces, EAF operations can also be stopped and started with little associated cost. For these reasons, production via EAFs has been steadily increasing for over 50 years and now accounts for about 33% of global steel production.

Oxygen Steelmaking

The majority of global steel production, about 66%, now comes from basic oxygen facilities. The development of a method to separate oxygen from nitrogen on an industrial scale in the 1960s allowed for major advances in basic oxygen furnaces.

Basic oxygen furnaces blow oxygen into large quantities of molten iron and scrap steel and can complete a charge much more quickly than open-hearth methods. Large vessels holding up to 350 metric tons of iron can complete conversion to steel in less than one hour.

The cost efficiencies of oxygen steelmaking made open-hearth factories uncompetitive, and, following its advent in the 1960s, open-hearth operations began closing. The last open-hearth facility in the US closed in 1992; China's last closed in 2001.
