Quantum Computing and Digital Evolution



The computer of today is based on a concept from the 1940s. Although the shrinking of computer chips has prompted developers to study quantum mechanical rules, today's computers still operate purely on the principles of classical physics.



  • The earliest computers of the 1940s were built from electron tubes and capacitors; the transistor, which began as a purely "classical" component, remains a vital part of every computer today.
  • The term "transistor" stands for "transfer resistor," which simply indicates that an electrical resistance is controlled by a voltage or current. 
  • The first transistor patent was submitted in 1925. Shortly after, in the 1930s, it was discovered that basic arithmetical operations could be performed by carefully controlling electric currents (for example, in diodes).
  • Low computing speed and high energy consumption are the two primary reasons why point-contact transistors and tube-based triodes and diodes are now seen only in technology museums.
  • Although the components have evolved, the architecture developed by Hungarian mathematician and scientist John von Neumann in 1945 remains the foundation for today's computers. 
  • The memory, which holds both the program instructions and (temporarily) the data to be processed, is at the heart of von Neumann's computer reference model.
  • A control unit manages the data processing sequentially, that is, step by step, in single binary computing steps. Computer scientists call this a "SISD architecture" (Single Instruction, Single Data); a toy version is sketched after this list.
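
To make this concrete, here is a minimal sketch in Python of the von Neumann idea: a single memory holds both the program and the data, and a control unit executes one instruction on one piece of data per step. The three-instruction set (LOAD, ADD, STORE) is invented for illustration and does not correspond to any real machine:

    # Toy von Neumann / SISD machine: one control unit steps through a
    # single memory holding both instructions and data, executing one
    # instruction on one piece of data at a time.
    memory = [
        ("LOAD", 5),      # put the value 5 into the accumulator
        ("ADD", 3),       # add 3 to it
        ("STORE", None),  # write the result to the output
    ]

    accumulator = 0
    output = []

    for instruction, operand in memory:  # strictly sequential execution
        if instruction == "LOAD":
            accumulator = operand
        elif instruction == "ADD":
            accumulator += operand
        elif instruction == "STORE":
            output.append(accumulator)

    print(output)  # [8]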

Although electron tubes and early transistors have been replaced by smaller, faster field-effect transistors on semiconductor chips, the architecture of today's computers has remained the same since its inception.


How does sequential information processing in computers work? 


In 1936, the British mathematician Alan Turing gave a theoretical outline of the fundamental units of data and of how they are processed.

Binary digits, or "bits," are the most basic units of information. "Binary" means "two-valued": a bit can assume either the state "0" or the state "1," much like a light switch that is either off or on.
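
As a quick illustration, the following Python lines show the two states a bit can take and preview how a row of eight bits (a "byte," discussed below) can stand for a letter; the pattern shown is the standard ASCII code for "A":

    bit_off, bit_on = 0, 1           # the only two states a bit can take

    # Eight bits in a row are enough to encode a letter:
    letter = "A"
    bits = format(ord(letter), "08b")
    print(bits)                      # 01000001 (ASCII code 65 for "A")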

  • The word "digital" comes from the Latin digitus, which means "finger," and refers to a time when people counted with their fingers. 
  • Today, "digital" refers to information that may be represented by numbers. 
  • In today's computers, electronic data processing means transforming an input consisting of many consecutively ordered bits into an output of the same form.
  • Blocks of individual bits are processed one after the other, much like chocolate bars on an assembly line; a letter, for example, requires a block of eight bits, referred to as a "byte."
  • There are just two processing options for a single bit: a 0 (or 1) stays a 0 (or 1), or a 0 (or 1) is flipped to a 1 (or 0).
  • The fundamental electrical components of digital computers, known as logic gates, are a small set of basic electronic circuits, realized by physical components such as transistors, through which information is transferred as electric impulses.
  • Connecting many such gates enables more sophisticated operations, such as the addition of two integers, as the sketch after this list shows.
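
As a sketch of that last point, the Python functions below mimic the behavior of logic gates and wire them into a full adder, which, chained bit by bit, adds two integers. This mirrors only the logic, of course, not the physical electronics:

    # Logic gates expressed as functions on single bits (0 or 1).
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def XOR(a, b): return a ^ b

    # A full adder combines gates to add two bits plus a carry bit.
    def full_adder(a, b, carry_in):
        partial = XOR(a, b)
        sum_bit = XOR(partial, carry_in)
        carry_out = OR(AND(a, b), AND(partial, carry_in))
        return sum_bit, carry_out

    # Chaining full adders adds whole integers, one bit at a time.
    def add_bits(x_bits, y_bits):
        carry, result = 0, []
        for a, b in zip(reversed(x_bits), reversed(y_bits)):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)
        return list(reversed(result))

    print(add_bits([0, 1, 1], [0, 0, 1]))  # 3 + 1 = 4 -> [0, 1, 0, 0]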

Every computer today is, in essence, a Turing machine: it does nothing but sequentially process information encoded in zeros and ones, transforming it into an output that is likewise encoded in zeros and ones.
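
A toy example of this view, sketched in Python: the "machine" below adds 1 to a binary number written on its tape, and its entire program is a transition table that says, for the current state and the symbol under the read/write head, what to write, where to move, and which state comes next:

    # Toy Turing machine: increments a binary number on its tape.
    TABLE = {
        ("carry", "1"): ("0", -1, "carry"),  # 1 plus carry is 0; carry moves left
        ("carry", "0"): ("1",  0, "halt"),   # 0 plus carry is 1; finished
        ("carry", " "): ("1",  0, "halt"),   # carry ran past the left edge
    }

    def run(tape):
        cells = [" "] + list(tape)           # blank cell for a possible overflow
        head, state = len(cells) - 1, "carry"
        while state != "halt":
            symbol, move, state = TABLE[(state, cells[head])]
            cells[head] = symbol
            head += move
        return "".join(cells).strip()

    print(run("1011"))  # 1100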


  • However, this simplicity of data processing comes at a cost: to manage the quantities of data that today's complex computer systems require, an enormous number of zeros and ones must be handled.
  • The processing capacity of a computer grows linearly with the number of available computing blocks: a chip with twice as many circuits can process data twice as fast.
  • The clock speed of today's computer chips is measured in gigahertz, that is, billions of cycles per second, which requires billions of transistors.
  • The circuitry must be tiny to fit that many transistors on a chip the size of a thumbnail; only then can the total size and energy consumption of such fast-switching systems be kept under control.
  • The move from the electron tube to semiconductor-based bipolar and field-effect transistors, first realized in 1947, was critical for shrinking the fundamental computing units on the integrated circuits of microchips.
  • Doped semiconductor layers are used to construct these nanoscale transistors. 


This is where quantum mechanics enters the picture. 

  • To understand and control what is going on, we need a quantum mechanical model of how electrons move within these semiconductors.
  • This is the so-called "band model" of electronic energy levels in solids such as semiconductors.

Understanding quantum physics was not required for the digital revolution of the twentieth century, but it was a prerequisite for the extreme miniaturization of integrated circuits.


~ Jai Krishna Ponnappan
