
Computers in the 1970s: The Decade That Ushered in Personal Computing and Networking as We Know It

Do you remember the old computer models from the 1970s? With their flashing lights, reel-to-reel magnetic tapes and bulky frames occupying entire rooms, those early computing machines seem light years away from the sleek laptops and smartphones we carry around today.

Yet much of the foundation for the modern personal computing technology we now take for granted was laid in that pioneering decade. As Intel co-founder Gordon Moore had predicted, the number of transistors that could be squeezed onto a chip – and with it, practical computing power – kept doubling every couple of years. Tiny microchips emerged that packed the power of the giant early machines. Visionaries began working on networking concepts that would one day form the global Internet.
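
To make that doubling claim concrete, here is a rough back-of-the-envelope sketch in Python. The 2,300-transistor starting point is the Intel 4004 discussed below; the clean two-year doubling period is a simplifying assumption for illustration, not Moore's precise formulation.

```python
# A rough, illustrative projection: start from the Intel 4004's 2,300
# transistors in 1971 and assume a clean doubling every two years to see
# what the trend implies over the decade.
START_YEAR = 1971
START_TRANSISTORS = 2300
DOUBLING_PERIOD_YEARS = 2

for year in range(1971, 1981, 2):
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    print(year, round(START_TRANSISTORS * 2 ** doublings))
# 1979 comes out around 37,000 transistors per chip - roughly the ballpark
# that real late-1970s processor designs actually reached.
```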

In the span of just ten years, the computer landscape evolved from room-sized number crunchers to a new generation of smaller, more interactive machines accessible to wider audiences than ever before. Read on as we explore the major computing milestones that transformed technology in the 70s.

The Rise of the Microprocessor

The microprocessor – a tiny integrated circuit containing a computer's core logic on a single compact chip – truly took off commercially in the early 1970s. These microchips enabled computing devices to become far smaller and more affordable than ever before.

In November 1971, Intel unveiled its 4004 chip – the world's first commercially available microprocessor, a 4-bit design. Small enough to fit on a fingertip, it nonetheless packed 2,300 transistors and performed around 60,000 operations per second.

Microprocessor | Year Released | Bit Capacity | Transistor Count | Clock Speed
Intel 4004     | 1971          | 4-bit        | 2,300            | 740 kHz

While basic compared to today's hardware, Intel's 4004 microprocessor was absolutely groundbreaking for miniaturizing computer processing capabilities at the time. It opened the floodgates for many companies to begin shrinking down the size of computer hardware through microchip integration.

Over the first half of the decade, Intel and competitors released increasingly sophisticated microprocessors for what would become the foundation of smaller, cheaper computing systems:

  • Intel launched the 8008 8-bit microprocessor in 1972 and the much more powerful 8080 8-bit microprocessor in 1974, clocking up to 2 million cycles per second.

  • Rival Zilog introduced the Z80 8-bit microprocessor in 1976, initially clocked at 2.5 million cycles per second with 4 MHz versions following soon after. The Z80 later powered many early home computers and gaming consoles.

Thanks to these early pioneers, the era of the microprocessor had clearly arrived by the mid-1970s. But these tiny chips needed something to power – which led to a new revolution in microcomputers as the decade went on…

Microcomputers Emerge for Consumers and Businesses

The advent of the microprocessor allowed computer companies to shrink down the size and scope of computing machines to create affordable, practical microcomputers for various uses. Although limited in capabilities compared to modern systems, these pioneering devices brought digital computing into more hands than ever before.

For example, the French-made Micral N, released in early 1973, is considered one of the earliest commercial non-kit microcomputers. Developed by Réalisations & Etudes Electroniques (R2E), the Micral N packed Intel's 8008 microchip alongside a small bank of memory, all housed within a single compact processing unit case. Costing around $1,450, the Micral N found niche success among European businesses and industry.

Photograph showing the Micral N microcomputer introduced in 1973

Then in January 1975, an Albuquerque-based company called MITS introduced a microcomputer specifically aimed at the blossoming computer hobbyist scene, called the Altair 8800. Priced at just $439 as an assemble-it-yourself kit, it became an unlikely smash hit.

The Altair 8800 took input from rows of front-panel toggle switches and blinked its lights to output data, reminiscent of early calculator interfaces. Yet its Intel 8080 CPU let enthusiasts program the device however they liked, either in raw binary machine code entered on the switches or via a BASIC interpreter loaded separately. Over 10,000 Altair kit computers were sold in the first year – startling demand that took even its creators by surprise!
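
For a sense of what "programming in binary machine code" meant in practice, here is a minimal illustrative sketch in Python. The byte values are standard Intel 8080 opcode encodings (MVI A, ADI, HLT), but the toy interpreter that runs them is a simplification for demonstration, not a real Altair emulator.

```python
# A toy three-instruction Intel 8080 program of the kind an Altair owner
# would have toggled in byte by byte on the front-panel switches.
program = [
    0x3E, 0x05,  # MVI A, 5   ; load the value 5 into the accumulator
    0xC6, 0x03,  # ADI 3      ; add 3 to the accumulator
    0x76,        # HLT        ; halt the processor
]

def run(memory):
    a, pc = 0, 0                      # accumulator and program counter
    while True:
        op = memory[pc]
        if op == 0x3E:                # MVI A, d8
            a = memory[pc + 1]
            pc += 2
        elif op == 0xC6:              # ADI d8
            a = (a + memory[pc + 1]) & 0xFF
            pc += 2
        elif op == 0x76:              # HLT
            return a
        else:
            raise ValueError(f"unsupported opcode {op:#x}")

print(run(program))  # prints 8 - the kind of result an Altair showed on its LEDs
```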

Of course, 1976 marked the advent of the company most synonymous with personal computing – Apple. Steve Wozniak and Steve Jobs started Apple Computer out of a California garage to sell Wozniak's hand-built Apple I computer board. Targeted toward fellow electronics hobbyists, it was sold for $666.66 as a bare assembled motherboard, leaving customers to supply their own case, keyboard, monitor and power supply.

With only 200 Apple I devices produced, it barely made a commercial ripple. Yet the experience paved the way for Wozniak to craft a much more refined follow-up product – one that would truly democratize personal computing for the masses.

In early 1977 Apple formally incorporated, and that April it debuted the landmark Apple II personal computer system, complete with integrated keyboard, color graphics and sound, and built-in BASIC. It became a runaway success as one of the first affordable, consumer-friendly home computers on the mainstream market, with tens of thousands of machines snatched up by eager customers over the following two years.

Thanks to iconic early computing products like these, and the myriad competitors that appeared across the decade, microcomputers had fully arrived – no longer solely the domain of governments, research institutions and big corporations. The ripples would continue to spread through society and the technology sector in the coming decades.

Networking and Internet Protocols Emerge

As engineers worked to shrink down computing machines themselves, equally important advances were happening in how computers could communicate with one another. Throughout the 1970s, much of the early foundation work enabling modern computer networking and the Internet itself began to form.

In 1973, American engineers Vinton Cerf and Bob Kahn began collaborating on an experimental project for the U.S. Defense Department. They sought to solve an emerging challenge – how to connect computer networks built on incompatible communications protocols so that all users could seamlessly share data and access applications.

Their solution, published in a joint paper in May 1974, became TCP/IP (Transmission Control Protocol/Internet Protocol) – a model allowing digital packets of data to be transmitted reliably between disparate networks. Building on concepts like packet switching from prior efforts, TCP/IP provided common data formatting and transmission rules so that radically different underlying networks could exchange information.
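
The core idea is easier to see in miniature. The sketch below is a loose Python illustration of packetizing a message and reassembling it from sequence numbers – an analogy for the concept, not the actual TCP/IP protocol machinery (there are no headers, checksums, acknowledgements or retransmissions here).

```python
# Chop a message into numbered packets so different networks can carry them
# independently, then rebuild the original regardless of arrival order.
import random

def packetize(message: bytes, size: int = 8):
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the original message from packets received in any order."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packets can cross very different networks."
packets = packetize(message)
random.shuffle(packets)            # simulate packets arriving out of order
assert reassemble(packets) == message
print(reassemble(packets).decode())
```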

Concurrently with Cerf and Kahn's work, electrical engineer Bob Metcalfe at Xerox's Palo Alto Research Center (PARC) was developing a related concept he called "Ethernet" for connecting computers on a localized network. His 1973 memo on the idea evolved into experimental Ethernet networks running at PARC by 1975.

Table showing growth of ARPANET/Internet connected computer networks

These and related efforts, like IBM's SNA protocol, meant computers could suddenly connect to exchange information and share resources across building networks, institutional networks and even continent-spanning networks like the pioneering 1969 ARPANET.

As shown above, from just 4 initial ARPANET nodes in 1969, the network grew to well over a hundred connected host machines stretching across North America and Europe by the late 1970s – the true beginnings of what we now know as the global Internet.

Graphical User Interfaces Emerge from Xerox PARC

While most microcomputers of the 1970s still relied on blinking lights or text-based interfaces, visionaries at Xerox’s Palo Alto Research Center (PARC) saw the potential for much more intuitive graphical interfaces. Their pioneering work at PARC throughout the early 1970s would go on to heavily influence the evolution of modern personal computer and OS design.

In 1973, Xerox PARC researchers began developing the Alto – one of the first personal computer systems centered around a graphical desktop metaphor. Using a three-button mouse for input, the Alto could manipulate on-screen windows, icons and menus well before such concepts entered the mainstream tech lexicon.

Although it remained an internal prototype barely seen outside Xerox, the Alto illustrated how personal computing could become dramatically more visual and intuitive compared to then-traditional text-heavy interfaces.

Building on lessons from the Alto and other interim projects, PARC researchers conceived the Xerox Star workstation through the late seventies. When it finally launched in 1981, the Star incorporated what we now consider staple GUI features – the desktop metaphor, icons, drag-and-drop manipulation, overlapping windows and mouse-driven pointer control.

Xerox Star 1981 Workstation with graphical user interface

Thus, while largely underappreciated in their day, PARC's pioneering efforts throughout the 1970s to evolve personal computing from text-oriented tasks toward graphics-oriented workflows truly shaped the direction of user interfaces for decades to come.

Microsoft and Apple Are Born – Personal Computing Heavyweights

The mid-1970s marked the humble beginnings of two fateful corporate entities that would massively shape the trajectory of personal computing technology for decades – Microsoft and Apple.

In early 1975, Harvard student Bill Gates (soon to drop out) teamed up with childhood friend Paul Allen to co-found a small software company called Micro-Soft (later shortened to Microsoft). Sensing opportunity as microcomputers like the Altair 8800 gained steam among hobbyists, the young entrepreneurs believed they could provide key software for personal computing platforms.

Their first landmark product was a BASIC language interpreter commissioned in 1975 by MITS for the Altair 8800 – allowing users to easily program applications themselves. This pivotal deal cemented Microsoft's strategy of supplying mission-critical software, like programming languages and later operating systems, to third-party hardware makers.

Of course, 1976 also marked childhood friends Steve Jobs and Steve Wozniak's founding of Apple in a family garage, starting with hand-built microcomputer boards for fellow electronics enthusiasts. But it was 1977's launch of the Apple II that truly put Apple on the map as a major new technology manufacturer.

Unlike the hobbyist kits before it, the Apple II shipped as a fully integrated consumer appliance with color graphics and sound capabilities, built-in BASIC, an integrated keyboard and a cassette storage interface. Thanks to its friendly packaging and mass retail strategy, Apple II series sales skyrocketed through the late seventies – from 2,500 units shipped in 1977 to over 78,000 units shipped in 1979.

Line chart showing rising Apple II computer shipments from 1977 to 1979

Fueled by the rising accessibility of microcomputers across homes and schools, Apple rapidly grew into a leading personal computing brand. When the company went public in December 1980, just after the decade closed, it was valued at roughly $1.8 billion and employed more than 1,000 people.

Though both began as rather humble, garage-based startups in the mid-70s, Microsoft and Apple ended the decade well poised to become dominant forces steering the future of consumer technology, thanks in no small part to their relentless founder-leaders.

Mainframes and Supercomputers Still Crucial for Research

While smaller microcomputers grabbed lots of headlines as the flashy new thing, mainframe computing remained absolutely vital across scientific research, engineering, academia and large commercial data processing throughout the 1970s. Supercomputers also pushed new frontiers in computing power for organizations with massive computational demands.

For the first half of the decade, the 1969-released Control Data 7600 reigned as the fastest supercomputer globally, delivering on the order of 10 million instructions per second. It enabled government and research institutions to model highly complex problems like nuclear weapons performance and molecular dynamics that microprocessors couldn't yet approach.

In 1976, Cray Research founder and former CDC engineer Seymour Cray unveiled his namesake Cray-1 system – one of the first, and by far the most successful, commercial "vector processor" supercomputers, leveraging a new architecture for blazing performance. Priced from $5 to $8 million, around 80 Cray-1 units sold to elite organizations needing world-class number-crunching power. Early customers included Los Alamos National Laboratory for weapons research and the National Center for Atmospheric Research for weather modeling in the late seventies.

Table comparing select 1970s-era supercomputer processing and memory stats

Therefore, while compact new machines from Apple, Commodore and Atari may have stoked the mainstream tech imagination in the seventies, state-of-the-art computing remained largely driven by million-dollar mainframes and supercomputers for a while yet. But computing was no longer accessible only to white-coated academics or government spooks – personal computers were poised to truly storm the workplace in the coming 1980s.

Computers Arrive in the White House

On May 9, 1978, U.S. President Jimmy Carter oversaw the first computer installation for White House administrative operations – a Hewlett-Packard 3000 system with multiple terminals, data storage and printing capabilities. While not connected to external networks, the HP 3000 modernized internal White House organizational functions, from keeping legislative schedules to administering office budgets.

Prior presidents since Kennedy had funded various federal computer initiatives and academic research projects. But Carter was the first to upgrade executive office equipment from typewriters and mechanical calculators to networked digital computers. This mirrored a broader wave of traditional large organizations and major corporations transitioning their own operations from analog paper workflows to networked computing systems throughout the late seventies. Within a few years, most white collar office jobs would involve interacting with business computers in some daily capacity.

New Data Storage Options Emerge to Complement Tape

While unwieldy open-reel magnetic tapes and punch cards continued as the mainstream digital storage media at most commercial data centers and research mainframes, new high-density mass storage options emerged in the seventies offering capabilities tape could not match.

In 1973, IBM released a new direct-access storage device – the 3340 hard disk drive, known by its "Winchester" code name. Befitting its enterprise market, the 3340 offered up to 70 MB of formatted storage per data module and data transfer rates approaching 1 MB per second, thanks to drive innovations like sealed head-and-platter assemblies and servo track following.

Although extremely expensive ($52,000 per 70 MB drive) and physically huge by modern standards, IBM's new 3340 HDD could nonetheless outperform most existing tape drives – introducing handy new capabilities like rapid random access and granular file storage. Its capacity and performance opened new options for I/O-intensive commercial computing applications of the era.
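
A back-of-the-envelope comparison helps show why random access mattered. All the figures below (tape reel length and speed, disk seek time and spindle speed) are assumed, ballpark 1970s values for illustration rather than the IBM 3340's published specifications.

```python
# Compare fetching one record from the far end of a tape reel with fetching
# one record from anywhere on a disk. All numbers are assumed, typical
# 1970s ballpark values, not the IBM 3340's actual spec sheet.
TAPE_LENGTH_INCHES = 2400 * 12   # a common 2,400-foot open reel
TAPE_SPEED_IPS = 125             # assumed tape read speed, inches per second
DISK_AVG_SEEK_S = 0.025          # assumed average head seek time
DISK_RPM = 3600                  # assumed spindle speed

tape_worst_case_s = TAPE_LENGTH_INCHES / TAPE_SPEED_IPS   # wind to the far end
disk_access_s = DISK_AVG_SEEK_S + (60 / DISK_RPM) / 2     # seek + half a spin

print(f"tape worst case: ~{tape_worst_case_s:.0f} seconds")  # minutes of winding
print(f"disk access:     ~{disk_access_s * 1000:.0f} ms")    # tens of milliseconds
```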

Chart showing rising typical hard disk drive capacities from 1973 - 1979

Hard disk units remained specialty equipment, but their rising densities, shown above, complemented the mainstream tape drives found across most data centers through 1979. They were still far too large and pricey to be feasible outside an institutional computing context. However, that began changing late in the decade.

Following work at IBM and elsewhere on miniaturizing drive mechanisms, companies started shrinking hard drives toward ubiquitous consumer viability in the coming years – reaching 10+ gigabyte capacities in laptop-sized drives before the nineties concluded.

On the removable storage front, compact cassette tapes and floppy disks also arose as handy portable media for transferring files between home and work computers by 1979 – the most personal form of data carriage then available. However, these media initially held only tens to hundreds of kilobytes – too little for most business systems – until densities grew in subsequent decades.

Lastly – with the rise of consumer electronics like music players, video recorders and now personal computers all needing reliable, dense data storage – new optical disc research unlocked a breakthrough.

In 1978, Sony publicly demonstrated a prototype optical audio disc that stored digital data as microscopic pits read by laser from a spinning plastic platter. The format Sony went on to standardize with Philips – the 12 cm Compact Disc – could hold over an hour of music, amounting to hundreds of megabytes of data. While not a computing format initially, the CD's immense potential was obvious, and it would spread as a personal computer storage medium in the following decade.
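
That capacity figure follows from simple arithmetic. The parameters below are those of the finished CD-DA audio standard (44.1 kHz, 16-bit, stereo) rather than the 1978 prototype's exact specification, so treat this as an order-of-magnitude illustration.

```python
# Why an hour of uncompressed digital audio lands in the hundreds of megabytes:
# multiply sample rate by sample size by channel count by playing time.
SAMPLE_RATE_HZ = 44_100    # samples per second, per channel (CD-DA standard)
BYTES_PER_SAMPLE = 2       # 16-bit samples
CHANNELS = 2               # stereo
SECONDS = 60 * 60          # one hour of music

total_bytes = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * CHANNELS * SECONDS
print(f"{total_bytes / 1_000_000:.0f} MB")   # about 635 MB
```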

Conclusion

As we've explored, the 1970s bore witness to an incredible range of foundational computing milestones – from hardware miniaturization and more approachable microcomputers for businesses and hobbyists, through graphical interfaces and the desktop metaphor, to packet networking research and early Internet protocols – all accompanied by vast expansions in processing speed, storage capacity and connectivity between networked machines.

These seminal technological breakthroughs radically reshaped what computers could achieve, and who could use them, by the time the decade concluded. While behemoth number-crunching mainframes still dominated high-performance computing, visionary researchers and entrepreneurs had successfully broken computing free of strictly scientific or institutional confines, pointing toward a more personal, soon desktop-centric landscape.

By the end of the seventies, the key ingredients – affordability, accessibility and connectivity – were mingling at scale, setting the stage for the workplace and household personal computing revolution that would unfold in the 1980s.