Steel

Steel is the most widely used of all metals, with uses ranging from concrete reinforcement in highways and high-rise buildings to automobiles, aircraft, and vehicles in space. Steel is iron combined or alloyed with other metals or with nonmetals such as carbon. It is more ductile (able to deform without breaking) and more durable than cast iron, and it is generally forged, rolled, or drawn into various shapes.



Since the beginning of the Iron Age, about 1000 B.C., mankind's progress has depended greatly on tools and equipment made with iron, which were in turn used to fashion many other much-needed goods. Eventually came the Industrial Revolution, a period of change beginning in mid-eighteenth-century England in which extensive mechanization of production shifted manufacturing from homes and farms to large-scale factories. Machine tools and other equipment made of iron and steel significantly changed the economies of both farm and city.

The history of iron and steel began at least 6,000 years ago. Early peoples probably first learned to use iron from fallen meteorites. Many meteorites are composed of iron and nickel, which together form a much harder metal than pure iron, and the ancients could make crude tools and weapons by hammering and chipping it. Because this useful metal came from the heavens, early human beings probably did not associate it with the iron found in the ground. Metallic iron was also likely found in the ashes of fires that had been built on outcroppings of red iron ore, or iron oxide; the red ore was called paint rock, and fires were built against banks of ore that had been exposed to wind and weather. Iron ore is found on every continent.

Smelting iron, a primitive direct-reduction method of separating iron from its ore using a charcoal forge or furnace, probably began in China and India and then spread westward to the area around the Black Sea. Unlike copper ores, which yielded molten copper in these furnaces, iron would not melt at temperatures below 2,799°F (1,537°C), and the highest temperature that could be reached in these primitive smelters appears to have been about 2,192°F (1,200°C). Iron ore subjected to that temperature does not melt but instead forms a spongy mass (called "sponge" iron) mixed with impurities called slag. The ironworker removed this spongy mass from the furnace and then squeezed the slag out of it by hammering. This "wrought" iron had less tendency to corrode, and the stringers of slag gave it a fibrous quality and a certain toughness.
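
As a quick check on the figures quoted above (an editorial illustration, not part of the historical account), the Fahrenheit and Celsius values agree under the standard conversion:

F = (9/5) × C + 32

Melting point of iron: (9/5)(1,537) + 32 ≈ 2,799°F
Highest primitive smelter temperature: (9/5)(1,200) + 32 = 2,192°F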

The Hittites, an ancient tribe living in Asia Minor and northern Syria, produced iron starting about 2500 B.C. The Chalybes, a subject tribe of the Hittites, invented a cementation process about 1400 B.C. to make the iron stronger. The iron was hammered and heated in contact with charcoal. The carbon absorbed from the charcoal produced a much harder iron. With the fall of the Hittite empire, the various tribes scattered, carrying the knowledge of smelting and the cementation process with them to Syria, Egypt, and Macedonia. Widespread use of iron for weapons and tools began about 1000 B.C., marking the beginning of the Iron Age.

The ancient Egyptians learned to raise the smelting temperature in the furnace by blowing a stream of air into the fire with blowpipes and bellows. Around 500 B.C., Greek soldiers used iron weapons that had been hardened by quenching the hot metal in cold water. The Romans learned to reheat the iron after quenching, a process called tempering, which made the iron less brittle.

During the Middle Ages, from about A.D. 500 to A.D. 1500, the old methods of smelting and cementation continued. Early blacksmiths made chain mail, weapons, nails, horseshoes, and tools such as iron plows. The Stückofen, a furnace first developed by the Romans, was made larger and taller for better air draft; it was a forerunner of the modern blast furnace. Waterwheels came into use for ironmaking between A.D. 1200 and A.D. 1350, converting the energy of swift stream currents into work that drove air bellows, forcing blasts of air into the furnace. The resulting higher temperatures melted the iron, which was then cast into "pigs" of cast iron (so named because the runners and series of ingots resembled pigs suckling their mother). As time progressed, these early blast furnaces were built larger and more efficient, reaching 30 ft (9 m) in height and operating continuously for weeks at a time.

About A.D. 1500, ironmakers faced wood shortages that threatened their source of charcoal. Increased warfare and the resulting demand for more iron weapons forced ironmakers to turn to coal as an alternative fuel. A major problem with coal was that it contained impurities such as sulfur and phosphorus that tended to make the iron brittle. In 1709 Abraham Darby of England successfully smelted pig iron using "coke," the residue left after soft coal is heated to drive off impurities. Crucible cast steel was invented around 1740 by Benjamin Huntsman of England: a clay crucible, or cup, of carburized iron (blister steel) was placed in a furnace and, once molten, the metal was cast. The resulting cast steel was of very high purity because the molten steel never came into contact with the fuel. In 1784 another improvement was made by Henry Cort, an English ironmaker, who invented the puddling of molten pig iron: a worker standing near the furnace door stirred the liquid iron, working air into it. A reverberatory furnace was used, in which the coal was kept separate from the iron to prevent contamination. After the pig iron had been converted into wrought iron, it was run through a rolling mill whose grooved rollers pressed out the remaining slag. Cort's rolling mill, patented in 1783, could make iron bars about 15 times faster than the old hammer method.

From 1850 to 1865, great advances were made in iron and steel processing. Beginning around 1860, steel gained popularity over iron as less expensive manufacturing methods were discovered and steel of greater quantity and quality could be produced.

William Kelly of the United States and Henry Bessemer of England, working independently, discovered the same method for converting iron into steel: subjecting molten pig iron to a blast of air, which burned out most of the impurities, with the carbon contained in the molten iron acting as the fuel. Kelly built his first converter in 1851 and received an American patent in 1857, but he went bankrupt that same year, and the method became known as the Bessemer process. In 1856 Bessemer completed his vertical converter, and in 1860 he patented a tilting converter that could be tilted to receive molten iron from the furnace and to pour out its load of liquid steel. The Bessemer converter made possible the high-tonnage production of steel for ships, railroads, bridges, and large buildings in the mid-nineteenth century. However, the steel was made brittle by the many impurities that remained, especially phosphorus and sulfur, and by oxygen absorbed from the air blast. An English metallurgist, Robert F. Mushet, discovered in 1856 that adding an iron alloy containing manganese (spiegeleisen) would remove the oxygen. Around 1875, Sidney G. Thomas and Percy Gilchrist, two English chemists, discovered that adding limestone to the converter removed the phosphorus and most of the sulfur.
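
In modern chemical terms, the refining steps described above can be summarized by the following simplified equations (a present-day gloss, not part of the historical account):

Si + O2 → SiO2 (silicon in the pig iron burns first, supplying much of the heat)
2C + O2 → 2CO (carbon burns off as carbon monoxide, lowering the carbon content)
Mn + FeO → MnO + Fe (Mushet's spiegeleisen: manganese scavenges dissolved oxygen)
2P + 5FeO + 3CaO → Ca3(PO4)2 + 5Fe (Thomas and Gilchrist: lime binds phosphorus into the slag)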

In England, another new furnace was introduced in 1861 by two brothers, William and Frederick Siemens. This was the open-hearth furnace, also known as the regenerative open hearth because the outgoing hot gases were used to preheat the incoming air. Pierre Émile Martin of France improved the process in 1864 by adding scrap steel to the molten iron to speed purification. During this period hardened alloy steels came into commercial use: Mushet made a high-carbon tool steel in 1868 which gave tools longer life; in France, a chromium steel alloy was produced in 1877 and a nickel steel alloy in 1888. An Englishman, Sir Robert Hadfield, discovered in 1882 how to harden manganese steel by heating it to a high temperature and then quenching it in water.

Around 1879, the electric furnace was developed by William Siemens. It was used very little before 1910 because of high electricity costs and the poor quality of the electrodes used to produce the arc for melting.

The open-hearth furnace remained the most popular method of steel production until the early 1950s. Pure oxygen then became more economical to produce in large quantities, and in 1954 the first basic oxygen process facility opened for production in the United States. Today, most of the world's steel is made in either a basic oxygen furnace or an electric furnace.

