
Microprocessor – Definition, History and its Features

Microprocessor Definition

A microprocessor is the central integrated circuit of a computer system, where the logical and arithmetic operations (calculations) are carried out to execute programs, from the operating system to application software.

A microprocessor may contain one or more CPUs (Central Processing Units), each consisting of registers, a control unit, an arithmetic logic unit, and a floating-point unit (or mathematical coprocessor).

The microprocessor generally connects to the motherboard through a socket, together with a thermal dissipation system made up of a heat sink and a cooling fan.

A single microprocessor may have one or more physical or logical cores, which carry out all the calculation work, and a single computer system may have several processors working in parallel.

The performance of these processors is not easy to measure, but clock frequency (measured in hertz) is commonly used to compare the power of one against another.
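
As a rough illustration of why clock frequency alone is a crude yardstick, the following Python sketch (a hypothetical micro-benchmark, not a rigorous measurement) estimates how many simple operations a machine actually completes per second; the result is far below the raw clock rate because of interpreter and memory overheads:

```python
import time

def ops_per_second(n=5_000_000):
    """Time a fixed integer workload and estimate operations per second."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i  # one simple arithmetic operation per iteration
    elapsed = time.perf_counter() - start
    return n / elapsed

rate = ops_per_second()
print(f"~{rate:,.0f} simple operations per second")
```

A 3 GHz clock does not translate into 3 billion such operations per second: interpreter overhead, cache behaviour, and instruction-level parallelism all intervene, which is why clock frequency is only a rough basis for comparison.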

Microprocessor History

Microprocessors emerged as a product of the technological evolution of two specific branches: computing and semiconductors. Both took shape in the middle of the 20th century, shortly after the Second World War, with the invention of the transistor (1947), which replaced vacuum tubes.

After that, silicon was used to build simple electronic circuits, giving rise in the early 1960s to the first digital logic families: Resistor-Transistor Logic (RTL), Diode-Transistor Logic (DTL), Transistor-Transistor Logic (TTL), and Emitter-Coupled Logic (ECL).

The next step towards microprocessors was the invention of integrated circuits (SSI and MSI, small- and medium-scale integration), which allowed the aggregation and miniaturization of components to begin.

The first calculators to use this technology, however, required between 75 and 100 integrated circuits, which was impractical. The next step in shrinking computing hardware was therefore the development of the first microprocessors.

The first microprocessor, the Intel 4004, was manufactured in 1971. It contained 2,300 transistors, handled data in 4-bit words, and could perform about 60,000 operations per second at a clock frequency of 740 kHz.

After that, the technological race drove the development of better and more powerful microchips: 8-bit, 16-bit, 32-bit, and 64-bit designs, currently reaching clock frequencies above 3 GHz.

Features

A microprocessor resembles a miniature digital computer: it has its own architecture and performs operations under a control program. This architecture is made up of:

  • Package: A ceramic or plastic shell that covers the silicon and protects it from the environment (such as the oxygen in the air).
  • Cache: A type of ultrafast memory available to the processor, so that it does not have to reach into RAM every time data is needed. Data is stored at several cache levels for immediate retrieval.
  • Mathematical coprocessor: Also called the floating-point unit (FPU), the portion of the processor responsible for floating-point calculations.
  • Registers: A small, fast working memory inside the processor, used to keep track of its own operations and state.
  • Ports: The pathways through which the processor exchanges information with the rest of the system's components.
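
The effect of the cache described in the list above can be hinted at even from Python (a hypothetical sketch; serious cache experiments are normally done in a compiled language): traversing a large contiguous array sequentially tends to be faster than visiting the same elements with a large stride, because sequential access reuses cache lines fetched from RAM.

```python
import time
from array import array

def traverse(data, stride):
    """Sum the elements of `data`, visiting each one exactly once,
    but in an order determined by `stride`. Only the memory access
    pattern differs between runs; the result is identical."""
    n = len(data)
    total = 0
    for start in range(stride):
        for i in range(start, n, stride):
            total += data[i]
    return total

# ~8 MB of contiguous 64-bit integers, larger than a typical L2 cache
data = array('q', range(1 << 20))

for stride in (1, 4096):
    t0 = time.perf_counter()
    total = traverse(data, stride)
    dt = time.perf_counter() - t0
    print(f"stride {stride:>4}: sum={total} in {dt:.3f}s")
```

Both runs compute the same sum, yet on most machines the stride-1 pass finishes faster; Python's interpreter overhead dampens the gap, but the cache-friendly access pattern still tends to win.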

