Information Age

The Information Age (also known as the Computer Age, Digital Age, or New Media Age) is a historical period that began in the mid-20th century, characterized by the rapid shift from the traditional industry established by the Industrial Revolution to an economy based on information technology. The onset of the Information Age can be associated with the invention of the transistor,[1] particularly the MOSFET (metal-oxide-semiconductor field-effect transistor),[2][3] which revolutionized modern technology[1] and became the fundamental building block of digital electronics in the Information Age.[2][3]

According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on advances in computer microminiaturization.[4] This evolution of technology in daily life and social organization has led to the modernization of information and communication processes becoming the driving force of social evolution.[5]

Progression

Library expansion

In 1945, Fremont Rider calculated that, if sufficient space were made available, library capacity would double every 16 years.[6] He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons or other institutions. He did not foresee the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media. Automated, potentially lossless digital technologies allowed vast increases in the rapidity of information growth. Moore's law, formulated around 1965, holds that the number of transistors in a dense integrated circuit doubles approximately every two years.[7]
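The contrast between these two doubling rates can be made concrete with a short calculation. The sketch below (illustrative only; the 48-year horizon is an arbitrary assumption) applies the standard doubling formula, growth = 2^(years / doubling period), to Rider's 16-year estimate for library capacity and Moore's roughly two-year estimate for transistor counts.

```python
# Illustrative sketch: growth implied by a fixed doubling period.
# The 48-year horizon is an arbitrary assumption chosen for illustration.

def growth_factor(years: float, doubling_period: float) -> float:
    """Multiplicative growth after `years`, given a doubling period in years."""
    return 2 ** (years / doubling_period)

horizon = 48  # years

library = growth_factor(horizon, 16)     # Rider: library capacity doubles every 16 years
transistors = growth_factor(horizon, 2)  # Moore's law: transistor count doubles every ~2 years

print(f"Library capacity over {horizon} years:  x{library:,.0f}")      # x8
print(f"Transistor count over {horizon} years:  x{transistors:,.0f}")  # x16,777,216
```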

The proliferation of smaller, less expensive personal computers and improvements in computing power by the early 1980s gave increasing numbers of workers sudden access to information, along with the ability to share and store it.

Connectivity between computers within companies enabled workers at different levels to access greater amounts of information.

Information storage

The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes in 1986 to 15.8 in 1993, over 54.5 in 2000, and to 295 (optimally compressed) exabytes in 2007. This is the informational equivalent of less than one 730-MB CD-ROM per person in 1986 (539 MB per person), roughly 4 CD-ROMs per person in 1993, 12 CD-ROMs per person in 2000, and almost 61 CD-ROMs per person in 2007.[8] It is estimated that the world's capacity to store information reached 5 zettabytes in 2014,[9] the informational equivalent of 4,500 stacks of printed books reaching from the Earth to the Sun.
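The per-person equivalences above follow from simple division. Below is a minimal sketch of that arithmetic; the world-population figures are rough outside assumptions (they do not appear in the source) included only so the calculation runs end to end, so the outputs match the cited values only approximately.

```python
# Rough reproduction of the per-person storage equivalences cited above.
# Population figures are approximate outside assumptions, not from the source.

EB = 10**18            # bytes in a (decimal) exabyte
CD_ROM = 730 * 10**6   # bytes in a 730-MB CD-ROM

storage = {1986: 2.6 * EB, 1993: 15.8 * EB, 2000: 54.5 * EB, 2007: 295 * EB}
population = {1986: 4.9e9, 1993: 5.5e9, 2000: 6.1e9, 2007: 6.6e9}  # assumed

for year, total in storage.items():
    per_person = total / population[year]
    print(f"{year}: ~{per_person / 1e6:,.0f} MB per person "
          f"(~{per_person / CD_ROM:.1f} CD-ROMs)")
```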

Exponential growth of data storage

The amount of digital data stored appears to be growing approximately exponentially, reminiscent of Moore's law, and the amount of available storage space likewise appears to be growing approximately exponentially (Kryder's law).[10][11][12][13][14]
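Assuming clean exponential growth, a doubling time can be backed out from any two of the storage figures cited in the previous section. The sketch below does this for the 1986 and 2007 data points and is purely illustrative.

```python
import math

# Estimate the doubling time of global storage capacity from two data points
# cited above: 2.6 exabytes in 1986 and 295 exabytes in 2007.

def doubling_time(value_start: float, value_end: float, years: float) -> float:
    """Doubling time in years, assuming clean exponential growth."""
    return years * math.log(2) / math.log(value_end / value_start)

t = doubling_time(2.6, 295, 2007 - 1986)
print(f"Implied doubling time: about {t:.1f} years")  # roughly 3 years
```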

Information transmission

The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986, 715 (optimally compressed) exabytes in 1993, 1.2 (optimally compressed) zettabytes in 2000, and 1.9 zettabytes in 2007 (this is the information equivalent of 174 newspapers per person per day).[8] The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986, 471 petabytes in 1993, 2.2 (optimally compressed) exabytes in 2000, and 65 (optimally compressed) exabytes in 2007 (this is the information equivalent of 6 newspapers per person per day).[8] In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. Technology was developing so quickly that a computer costing $3000 in 1997 would cost $2000 two years later and $1000 the following year.

Computation

The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993, 2.9 × 10^11 MIPS in 2000, and 6.4 × 10^12 MIPS in 2007.[8] An article in the journal Trends in Ecology & Evolution reports that by now digital technology "has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5x10^21 bytes per 7.2x10^9 people)".[9]
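The per capita comparison at the end of the quoted passage is a single division; the following sketch simply reproduces that arithmetic with the figures given in the quotation.

```python
# Reproduce the per capita storage comparison quoted above.

digital_storage_total = 5e21    # bytes of global digital storage, per the quote
world_population = 7.2e9        # people, per the quote
brain_storage_estimate = 1e12   # bytes, estimated capacity of one human brain, per the quote

per_capita = digital_storage_total / world_population
print(f"Digital storage per person: ~{per_capita:.1e} bytes")  # ~6.9e+11
print(f"Relative to one brain's estimated capacity: "
      f"~{per_capita / brain_storage_estimate:.2f}x")          # ~0.69x
```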

Relation to economics

Eventually, Information and Communication Technology—computers, computerized machinery, fiber optics, communication satellites, the Internet, and other ICT tools—became a significant part of the economy. Microcomputers were developed and many businesses and industries were greatly changed by ICT.

Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital.[15] His book discusses similarities and differences between products made of atoms and products made of bits. In essence, a copy of a product made of bits can be made cheaply and quickly, and shipped across the country or internationally quickly and at very low cost.

Impact on jobs and income distribution

The Information Age has affected the workforce in several ways.

It has created a situation in which workers who perform easily automated tasks are forced to find work that is not easily automated.[16] Workers are also being forced to compete in a global job market. Lastly, workers are being replaced by computers that can do their jobs faster and more effectively. This poses problems for workers in industrial societies that remain to be solved. However, solutions that involve reducing working time usually meet strong resistance.

Jobs traditionally associated with the middle class (assembly line workers, data processors, foremen and supervisors) are beginning to disappear, either through outsourcing or automation. Individuals who lose their jobs must either move up, joining a group of "mind workers" (engineers, doctors, attorneys, teachers, scientists, professors, executives, journalists, consultants), or settle for low-skill, low-wage service jobs.

The "mind workers" are able to compete successfully in the world market and receive (relatively) high wages. Conversely, production workers and service workers in industrialized nations are unable to compete with workers in developing countries and either lose their jobs through outsourcing or are forced to accept wage cuts.[17] In addition, the internet makes it possible for workers in developing countries to provide in-person services and compete directly with their counterparts in other nations.

This has had several major consequences, including increased opportunity in developing countries and the globalization of the workforce.

Workers in developing countries have a competitive advantage that translates into increased opportunities and higher wages.[18] The full impact on the workforce in developing countries is complex and has downsides. (see discussion in section on Globalization).

In the past, the economic fate of workers was tied to the fate of national economies. For example, workers in the United States were once well paid in comparison to the workers in other countries. With the advent of the Information Age and improvements in communication, this is no longer the case. Because workers are forced to compete in a global job market, wages are less dependent on the success or failure of individual economies.[17]

Automation, productivity and job gain

The Information Age has affected the workforce in that automation and computerisation have resulted in higher productivity coupled with net job loss in manufacturing.

In the United States, for example, from January 1972 to August 2010, the number of people employed in manufacturing jobs fell from 17,500,000 to 11,500,000, while manufacturing value rose 270%.[19]

Although it initially appeared that job loss in the industrial sector might be partially offset by the rapid growth of jobs in the IT sector, the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the IT sector. This pattern of decrease in jobs continued until 2003.[20]

Data has shown that overall, technology creates more jobs than it destroys even in the short run.[21]

Rise of information-intensive industry

Industry is becoming more information-intensive and less labor- and capital-intensive (see Information industry). This trend has important implications for the workforce; workers are becoming increasingly productive as the value of their labor decreases. However, there are also important implications for capitalism itself; not only is the value of labor decreased, the value of capital is also diminished. In the classical model, investments in human capital and financial capital are important predictors of the performance of a new venture.[22] However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale.[23]

Innovations

The Information Age was enabled by technology developed during the Digital Revolution, which itself built on developments of the Technological Revolution.

Transistors

The onset of the Information Age can be associated with the invention of the transistor.[1] The concept of a field-effect transistor was first theorized by Julius Edgar Lilienfeld in 1925.[24] The first practical transistor was the point-contact transistor, invented by the physicists William Shockley, Walter Houser Brattain and John Bardeen in 1947. This was a breakthrough that laid the foundations for modern technology.[1] Shockley's research team also invented the bipolar junction transistor in 1952.[25][24] However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[26]

The MOSFET (metal-oxide-semiconductor field-effect transistor), also known as the MOS transistor, was invented by Mohamed Atalla and Dawon Kahng in 1959.[27][28][25][3] The MOSFET was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[26] With its high scalability,[29] much lower power consumption, and higher density than bipolar junction transistors,[30] the MOSFET made it possible to build high-density integrated circuits (ICs),[25] allowing the integration of more than 10,000 transistors in a small IC[31] and later billions of transistors in a single device.[32]

The widespread adoption of MOSFETs revolutionized the electronics industry,[33] including control systems and computers since the 1970s,[34] and later made possible digital communications technology such as smartphones.[32] As of 2013, billions of MOS transistors are manufactured every day.[25] The MOS transistor has been the fundamental building block of digital electronics since the late 20th century, paving the way for the digital age.[3] The MOS transistor is credited with transforming society around the world,[32][3] and has been described as the "workhorse" of the Information Age,[2] as the building block of every microprocessor, memory chip and telecommunication circuit in use as of 2016.[35]

Computers

Before the advent of electronics, mechanical computers, like the Analytical Engine in 1837, were designed to provide routine mathematical calculation and simple decision-making capabilities. Military needs during World War II drove development of the first electronic computers, based on vacuum tubes, including the Z3, the Atanasoff–Berry Computer, Colossus computer, and ENIAC.

The invention of the transistor enabled the era of mainframe computers (1950s–1970s), typified by the IBM 360. These large, room-sized computers provided data calculation and manipulation that was much faster than humanly possible, but were expensive to buy and maintain, and so were initially limited to a few scientific institutions, large corporations, and government agencies.

The germanium integrated circuit (IC) was invented by Jack Kilby at Texas Instruments in 1958.[36] The silicon integrated circuit was then invented in 1959 by Robert Noyce at Fairchild Semiconductor, using the planar process developed by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface passivation method developed at Bell Labs in 1957.[28][38] Following the invention of the MOS transistor by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959,[27] the MOS integrated circuit was developed by Fred Heiman and Steven Hofstein at RCA in 1962.[39] The silicon-gate MOS IC was later developed by Federico Faggin at Fairchild Semiconductor in 1968.[40] With the advent of the MOS transistor and the MOS IC, transistor technology rapidly improved, and the ratio of computing power to size increased dramatically, giving direct access to computers to ever smaller groups of people.

The MOS integrated circuit led to the invention of the microprocessor. The first commercial single-chip microprocessor, the Intel 4004, was launched in 1971; it was developed by Federico Faggin, using his silicon-gate MOS IC technology, along with Marcian Hoff, Masatoshi Shima and Stan Mazor.[41][35]

Along with electronic arcade machines and home video game consoles in the 1970s, the development of personal computers like the Commodore PET and Apple II (both in 1977) gave individuals access to the computer. But data sharing between individual computers was either non-existent or largely manual, at first using punched cards and magnetic tape, and later floppy disks.

Data

The first developments for storing data were initially based on photographs, starting with microphotography in 1851 and then microform in the 1920s, with the ability to store documents on film, making them much more compact. In the 1970s, electronic paper allowed digital information to appear as paper documents.

Early information theory and Hamming codes were developed about 1950, but awaited technical innovations in data transmission and storage to be put to full use.

Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947 and An Wang at Harvard University in 1949.[43] The first commercial hard disk drive, the IBM 350, was shipped in 1956.[44] With the advent of the MOSFET, MOS semiconductor memory was developed by John Schmidt at Fairchild Semiconductor in 1964.[45][46] In 1967, Dawon Kahng and Simon Sze at Bell Labs developed the floating-gate MOSFET (FGMOS), which they proposed could be used for reprogrammable ROM (read-only memory),[47] providing the basis for non-volatile memory (NVM) technologies such as flash memory.[48] Following the invention of flash memory by Fujio Masuoka at Toshiba in 1980,[49][50] Toshiba commercialized NAND flash memory in 1987.[51][47]

While cables transmitting digital data connected computer terminals and peripherals to mainframes were common, and special message-sharing systems leading to email were first developed in the 1960s, independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (a term coined in 1974), and then the World Wide Web in 1989.

Public digital data transmission first utilized existing phone lines using dial-up, starting in the 1950s, and this was the mainstay of the Internet until broadband in the 2000s. The introduction of wireless networking in the 1990s combined with the proliferation of communications satellites in the 2000s allowed for public digital transmission without the need for cables. This technology led to digital television, GPS, and satellite radio through the 1990s and 2000s.

Computers continued to become smaller and more powerful, to the point where they could be carried.

In the 1980s and 1990s, laptops were developed as portable computers, and personal digital assistants (PDAs) could be used while standing or walking. Pagers, which had existed since the 1950s, were largely replaced by mobile phones beginning in the late 1990s, bringing mobile networking features to some computers. Now commonplace, this technology has been extended to digital cameras and other wearable devices. Starting in the late 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing.

Optics

Optical communication has played an important role in communication networks.[53] Optical communication provided the hardware basis for internet technology, laying the foundations for the Digital Revolution and Information Age.[54]

In 1953, Bram van Heel demonstrated image transmission through bundles of optical fibers with a transparent cladding. The same year, Harold Hopkins and Narinder Singh Kapany at Imperial College succeeded in making image-transmitting bundles with over 10,000 optical fibers, and subsequently achieved image transmission through a 75 cm long bundle which combined several thousand fibers.[55]

While working at Tohoku University, Japanese engineer Jun-ichi Nishizawa proposed fiber-optic communication, the use of optical fibers for optical communication, in 1963.[56] Nishizawa invented other technologies that contributed to the development of optical fiber communications, such as the graded-index optical fiber as a channel for transmitting light from semiconductor lasers.[57][58] He patented the graded-index optical fiber in 1964.[54] The solid-state optical fiber was invented by Nishizawa in 1964.[59]

The three essential elements of optical communication were invented by Jun-ichi Nishizawa: the semiconductor laser (1957) as the light source, the graded-index optical fiber (1964) as the transmission line, and the PIN photodiode (1950) as the optical receiver.[54] Izuo Hayashi's invention of the continuous-wave semiconductor laser in 1970 led directly to the light sources used in fiber-optic communication, laser printers, barcode readers, and optical disc drives, which were commercialized by Japanese entrepreneurs,[60] and opened up the field of optical communications.[53]

See also

  • Attention economy

  • Big data

  • Cognitive-cultural economy

  • Computer crime

  • Cyberterrorism

  • Cyberwarfare

  • Datamation – the first print magazine dedicated solely to covering information technology.[61]

  • Digital dark age

  • Digital detox

  • Digital divide

  • Digital transformation

  • Digital world

  • Human timeline

  • Imagination age – hypothesized successor of the information age: a period in which creativity and imagination become the primary creators of economic value

  • Indigo Era

  • Information explosion

  • Information revolution

  • Information society

  • Internet governance

  • Netocracy

  • Social Age

  • Technological determinism

  • Zettabyte Era

  • The Hacker Ethic and the Spirit of the Information Age

References

[1] Castells, Manuel (1996). The Information Age: Economy, Society and Culture. Oxford: Blackwell. ISBN 978-0631215943. OCLC 43092627.

[2] Raymer, Michael G. (2009). The Silicon Web: Physics for the Internet Age. CRC Press. p. 365. ISBN 9781439803127.

[3] "Triumph of the MOS Transistor". YouTube. Computer History Museum. 6 August 2010. Retrieved 21 July 2019.

[4] Kluver, Randy. "Globalization, Informatization, and Intercultural Communication". United Nations Public Administration Network. Retrieved 18 April 2013.

[5] Hilbert, M. (2015). Digital Technology and Social Change [Open Online Course at the University of California] (freely available). Retrieved from https://canvas.instructure.com/courses/949415

[6] Rider, Fremont (1944). The Scholar and the Future of the Research Library. New York City: Hadham Press.

[7] "Moore's Law to roll on for another decade". CNET. Retrieved 2011-11-27. Moore also affirmed he never said transistor count would double every 18 months, as is commonly said. Initially, he said transistors on a chip would double every year. He then recalibrated it to every two years in 1975. David House, an Intel executive at the time, noted that the changes would cause computer performance to double every 18 months.

[8] Hilbert, Martin; López, Priscila (2011). "The World's Technological Capacity to Store, Communicate, and Compute Information". Science. 332 (6025): 60–65. Bibcode:2011Sci...332...60H. doi:10.1126/science.1200970. ISSN 0036-8075. PMID 21310967.

[9] Gillings, Michael R.; Hilbert, Martin; Kemp, Darrell J. (2016). "Information in the Biosphere: Biological and Digital Worlds". Trends in Ecology & Evolution. 31 (3): 180–189. doi:10.1016/j.tree.2015.12.013. PMID 26777788.

[10] "The Exponential Growth of Data". insideBIGDATA. 2017.

[11] Gantz, John; Reinsel, David (December 2012). "The digital universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East". EMC.

[12] Rizzatti, Lauro (2016). "Digital Data Storage is Undergoing Mind-Boggling Growth". EE Times.

[13] "The historical growth of data: Why we need a faster transfer solution for large data sets". Signiant.

[14] Roser, Max; Ritchie, Hannah. "Technological Progress". Our World in Data.

[15] "Negroponte's articles". Archives.obs-us.com. 1996-12-30. Retrieved 2012-06-11.

[16] Porter, Michael. "How Information Gives You Competitive Advantage". Harvard Business Review. Retrieved 9 September 2015.

[17] McGowan, Robert (1991). "The work of nations: Preparing ourselves for the 21st century capitalism, by Robert Reich. New York: Knopf Publishing, 1991". Human Resource Management. 30 (4): 535–538. doi:10.1002/hrm.3930300407. ISSN 1099-050X.

[18] Bhagwati, Jagdish N. (2005). In Defense of Globalization. New York: Oxford University Press.

[19] "U.S. Manufacturing: Output vs. Jobs, January 1972 to August 2010". BLS and Federal Reserve graphic, reproduced in Smith, Fran. "Job Losses and Productivity Gains", OpenMarket.org, 5 October 2010.

[20] Cooke, Sandra D. (2003). "Information Technology Workers in the Digital Economy", in Digital Economy 2003. Economics and Statistics Administration, Department of Commerce.