The four stages of computer development

Stage 1: Vacuum tube (1946-1957)

1. What vacuum tube computers looked like

During the Second World War, Britain's need to decrypt German ciphertext gave birth to the vacuum tube computer.

The world's first general-purpose electronic computer was ENIAC (1946).

2. Characteristics of vacuum tube computers

  • Low integration, large footprint
  • High power consumption, slow computation speed
  • Complex to operate; changing programs required rewiring

Stage 2: Transistor (1957-1964)

Three scientists at Bell Labs invented the transistor.

1. What transistor computers looked like

The world’s first transistorized computer: TX-0 (Lincoln Lab, MIT)

Another famous computer of the time was the PDP-1

  • 4K of memory; 200,000 instructions per second
  • Equipped with a 512 × 512 display

The PDP-1 was one of the most famous computers of its time, and it came with a display.

2. Features of transistor computers

  • Higher integration, smaller footprint
  • Lower power consumption, faster computation speed
  • Simpler operation, more convenient interaction

Stage 3: Integrated circuit (1964-1980)

Jack Kilby, an engineer at Texas Instruments, invented the integrated circuit (IC).

1. What integrated circuit computers looked like

2. Features of integrated circuit computers

  • Computers are getting smaller
  • Power consumption gets lower
  • Computations become faster

These characteristics made it possible for computers to enter ordinary households.

At this stage, the operating system was born.

IBM at first offered two incompatible product lines, the 7094 and the 1401, each with its own system software. The two machines served different purposes, their software was mutually incompatible, and many customers were reluctant to staff two separate teams to run two systems.

To solve these problems, IBM introduced the compatible System/360 family; its system software can be seen as a precursor of the modern operating system.

Stage 4: Very large scale integrated circuit (VLSI) (1980-present)

The CPU is one of the products of VLSI.

1. What VLSI computers looked like

2. Features of VLSI computers

  • Millions of transistors on a single chip
  • Faster, smaller, cheaper, and more accessible to the general public
  • More versatile: word processing, spreadsheets, highly interactive games and applications

3. Apple

The computer of the future

1. Biological computers

Protein molecules as the main raw material

Features:

  • Small size, high efficiency
  • Not easily damaged; biological-level self-repair
  • No signal interference, no heat loss

2. Quantum computers

A physical computer that follows the laws of quantum mechanics.

Some current research results:

  • In May 2013, Google and NASA launched the D-Wave Two
  • In May 2017, the Chinese Academy of Sciences announced a photonic quantum computer
  • In January 2019, IBM unveiled the world's first commercial quantum computer

Chinese Internet companies' research on quantum computers:

  • Tencent set up its quantum lab in 2017
  • Alibaba established the DAMO Academy in 2017

The development history of the microcomputer

The development of the microcomputer began in the integrated circuit stage, and its progress was largely constrained by CPU performance.

1. Single-core CPU development history:

  • 1971-1973: 500 kHz microcomputers (8-bit word length)
  • 1973-1978: microcomputers with frequencies above 1 MHz (8-bit word length)
  • 1978-1985: 500 MHz microcomputers (16-bit word length)
  • 1985-2000: microcomputers with frequencies above 1 GHz (32-bit word length)
  • 2000-present: microcomputers with frequencies above 2 GHz (64-bit word length)

2. Moore’s Law

In the 20th century, there was a famous Moore’s Law about the performance of integrated circuits.

The performance of integrated circuits doubles every 18-24 months (proposed by Gordon Moore, a co-founder of Intel).
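This doubling rule compounds quickly; a minimal Python sketch (the 10-year horizon is just an illustrative example):

```python
# Moore's Law as a compounding rule: performance doubles every 18-24 months,
# so over 10 years (120 months) it grows by 2 ** (120 / months_per_doubling).
for months_per_doubling in (18, 24):
    factor = 2 ** (120 / months_per_doubling)
    print(f"doubling every {months_per_doubling} months -> x{factor:.0f} in 10 years")
# -> roughly x102 (18-month doubling) and x32 (24-month doubling)
```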

But Moore’s Law has broken down in the 21st century: as chip circuitry becomes ever more complex and dense, heat dissipation grows higher and higher, and because we cannot yet solve these problems, performance can no longer improve at the old pace.

3. Development of multi-core CPUs:

  • 2005: Intel Pentium series dual-core CPUs; AMD Athlon series dual-core CPUs
  • 2006: Intel Core quad-core CPU
  • Intel Core series 16-core CPU
  • Intel Core series 56-core CPU

Supplementary knowledge points

1. What is the CPU frequency?

The CPU frequency, that is, the clock frequency of the CPU, is the frequency at which the CPU operates while computing. Its unit is Hz.

The CPU frequency largely determines the speed of a computer. As computers developed, the main frequency rose from MHz in the past to GHz today (1 GHz = 10^3 MHz = 10^6 kHz = 10^9 Hz).
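These unit conversions can be sketched in Python (the 2.4 GHz figure is a hypothetical example):

```python
# Clock-frequency unit conversions: 1 GHz = 10^3 MHz = 10^6 kHz = 10^9 Hz
HZ_PER_KHZ = 10 ** 3
HZ_PER_MHZ = 10 ** 6
HZ_PER_GHZ = 10 ** 9

freq_ghz = 2.4                     # a hypothetical 2.4 GHz CPU
freq_hz = freq_ghz * HZ_PER_GHZ    # cycles per second
print(freq_hz / HZ_PER_MHZ)        # 2400.0 (MHz)
print(freq_hz / HZ_PER_KHZ)        # 2400000.0 (kHz)
```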

So is a computer faster the higher its CPU frequency?

Generally speaking, within the same series of microprocessors, a higher frequency means a faster computer; across different types of processors, however, frequency can only serve as a reference parameter.

In addition, the computing speed of a CPU also depends on the performance of every stage of its pipeline.

Because the main frequency does not directly represent computing speed, a CPU with a higher main frequency may, in certain circumstances, actually compute more slowly.

Therefore, the main frequency reflects only one aspect of CPU performance, not the CPU's overall performance.

2. What are the main frequency, the external frequency, and the frequency multiplier?

Two concepts are closely related to the processor's main frequency: the frequency multiplier and the external frequency.

The external frequency is the CPU's reference frequency, also measured in MHz. It is the synchronized speed at which the CPU and the motherboard run, and in most systems it is also the synchronized speed between the memory and the motherboard; in this sense the external frequency connects the CPU directly to the memory and keeps the two running in sync. The frequency multiplier is the ratio of the main frequency to the external frequency.

The main frequency, external frequency, and multiplier are related as follows: main frequency = external frequency × multiplier.
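This relation can be illustrated in a few lines of Python (the figures are hypothetical, not tied to any real CPU):

```python
# main frequency = external frequency x multiplier
external_freq_mhz = 200   # hypothetical external (bus) frequency, in MHz
multiplier = 17           # hypothetical locked multiplier
main_freq_mhz = external_freq_mhz * multiplier
print(main_freq_mhz)      # 3400 -> a 3.4 GHz CPU

# Overclocking from the external frequency: raise it while the multiplier stays locked.
print((external_freq_mhz + 10) * multiplier)   # 3570
```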

Early CPUs had no concept of a “frequency multiplier”; the main frequency and the system bus ran at the same speed.

As technology developed, CPUs became faster and faster, while memory, hard disks, and other components gradually could not keep up. The frequency multiplier emerged to solve this problem: it lets memory and other components continue to work at a relatively low system bus frequency, while the CPU's main frequency can (in theory) be raised without limit through the multiplier.

We can picture the external frequency as a production line in a factory and the multiplier as the number of production lines: the factory's output rate (the main frequency) is naturally the speed of each line (the external frequency) times the number of lines (the multiplier).

Manufacturers have largely locked the multiplier, so overclocking can only start from the external frequency: by pairing the locked multiplier with a higher external frequency, set via motherboard jumpers or “soft” overclocking in the BIOS, part of the computer's overall performance can be improved.

When purchasing, try to pay attention to the external frequency of the CPU.

3. What is “word length”?

Definition of word length

Computers represent numbers, characters, instructions, and other control information in binary code. When a computer stores, transmits, or operates on data, the group of binary digits handled as a unit is called a word, and the number of binary bits in a word is called the word length.

A CPU that processes data 8 bits at a time is called an 8-bit CPU; likewise, a 32-bit CPU processes 32 bits of binary data at a time.

Each 0 or 1 in binary is the smallest binary unit, called a bit. Common word lengths are 8, 16, 32, and 64 bits. A group of 8 bits is called a byte, the basic coding unit in a computer.
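To illustrate what word length implies for the values a single word can hold, a small Python sketch (two's-complement ranges):

```python
# Unsigned and signed (two's-complement) value ranges for common word lengths
for bits in (8, 16, 32, 64):
    unsigned_max = 2 ** bits - 1
    signed_min = -(2 ** (bits - 1))
    signed_max = 2 ** (bits - 1) - 1
    print(f"{bits:>2}-bit word: unsigned 0..{unsigned_max}, "
          f"signed {signed_min}..{signed_max}")
# e.g. " 8-bit word: unsigned 0..255, signed -128..127"
```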

Word length is closely related to a computer's function and use, and it is an important technical indicator: it directly reflects a computer's calculation precision. To meet different requirements and balance calculation precision against hardware cost, most computers support variable word-length operations, that is, half-word, full-word (single-word), and double-word operations.

Other things being equal, the larger the word length, the faster the computer can process data. Early microcomputers generally used 8-bit and 16-bit word lengths, while the 386 and later processors were mostly 32-bit. Most computers on the market today use 64-bit processors.

The word length is determined by the number of data lines in the microprocessor's external data bus.

References for this article:

  • MOOC course “Programming Essentials: Principles of Computer Organization, Operating Systems, and Computer Networks”
  • CPU frequency
  • Word length