Is NASA On The Lookout For Aliens?





    The hunt for extraterrestrial life is one of NASA's main objectives. 


    NASA has yet to discover any convincing evidence of alien life, but it has long explored the solar system and beyond to help answer fundamental questions, such as whether we are alone in the cosmos. 

    The agency's astrobiology program studies the origins, evolution, and distribution of life in the universe. 

    NASA's science missions are working together to discover unambiguous evidence of life beyond Earth: investigating water on Mars, exploring potential "ocean worlds" like Titan and Europa, and searching for biosignatures in the atmospheres of planets both in our cosmic neighborhood and beyond our solar system. 



    Is there a chance that life exists anywhere other than Earth? 



    There is a chance, if not a certainty, that life exists somewhere other than Earth. Science is driven by a desire to explore the unknown, yet science ultimately rests on evidence, and evidence of alien life has yet to be found. We will, however, continue our search. 



    Do intelligent extraterrestrials exist? 


    There is no credible evidence of life elsewhere, intelligent or otherwise, based on research at the SETI Institute, examination of Martian meteorites, discoveries of methane in the Martian atmosphere, and other similar investigations. 

    The hunt for life in the cosmos, on the other hand, is one of NASA's main objectives. 

    NASA is in charge of the US government's hunt for alien life, whether it's here on Earth, on the planets and moons of our solar system, or farther out in space. 



    How does NASA go about looking for life? 


    NASA's hunt for life is a complex undertaking. The research approach of NASA's astrobiology program focuses on three fundamental questions: 


      • How does life begin and evolve? 
      • Does life exist elsewhere in the universe? 
      • How do we search for life in the universe? 

    • Astrobiologists have uncovered a wealth of clues to these major questions over the last 50 years. NASA's hunt for life includes missions such as the Transiting Exoplanet Survey Satellite (TESS) and the Hubble Space Telescope, which look for habitable exoplanets. 
    • Missions such as the forthcoming James Webb Space Telescope will look for biosignatures in the atmospheres of other planets; finding oxygen and carbon dioxide in an exoplanet's atmosphere, for example, may indicate that it supports plant and animal life the way ours does. 



    Is NASA on the lookout for technosignatures? 


    A biosignature is any observable indication of life, living or dead; technosignatures are a particular kind of biosignature. 

    • Technosignatures are indicators of technology that may be used to infer the presence of intelligent life elsewhere in the cosmos, such as narrow-band radio transmissions or pulsed laser emissions. 
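
    A minimal sketch of the basic signal-processing idea behind narrow-band radio searches, assuming nothing about any real SETI pipeline: a narrow-band carrier concentrates its power in a single frequency bin, so a Fourier transform can lift it far above the noise floor even when it is invisible in the raw samples. The sample rate, tone frequency, and detection threshold below are all illustrative choices.

      # Illustrative only -- not NASA or SETI Institute code.
      import numpy as np

      rate = 1_000_000                               # samples per second (assumed)
      t = np.arange(rate) / rate                     # one second of data
      noise = np.random.normal(0.0, 1.0, rate)       # wide-band background noise
      tone = 0.02 * np.sin(2 * np.pi * 314_159 * t)  # faint narrow-band carrier

      spectrum = np.abs(np.fft.rfft(noise + tone))   # power per frequency bin
      freqs = np.fft.rfftfreq(rate, d=1 / rate)

      # The carrier's power piles up in one bin, so a simple threshold finds it
      # even though the tone is invisible in the time-domain data.
      threshold = spectrum.mean() + 8 * spectrum.std()
      print("candidate narrow-band frequencies (Hz):", freqs[spectrum > threshold])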


    The terms SETI (Search for Extraterrestrial Intelligence) and technosignatures are often used interchangeably. 


    • NASA funds technosignature research, but not ground-based radio-telescope searches, owing to NASA's policy of supporting astrophysical research using space-based assets. 
    • NASA also sponsored a Topical Workshops, Symposia, and Conferences event to create a research agenda that prioritizes and directs future theoretical and observational investigations of non-radio technosignatures, and to produce a publishable report that can seed a technosignatures library. 

    Given that a planet may support life for billions of years before intelligent life evolves to create technology that can be detected from other solar systems – our own planet, for example, has only been creating detectable technosignatures for a little over a century – we have a much better chance of finding life if we look for other biosignatures instead of just technosignatures. 



    Is NASA looking for or studying UAPs (Unidentified Aerial Phenomena)?


    NASA does not go out of its way to look for UAPs. However, NASA gathers extensive data about Earth's atmosphere via its Earth-observing satellites, frequently in cooperation with other international space agencies. 


    • While these data are not intentionally gathered to detect UAPs or extraterrestrial technosignatures, they are publicly accessible and anybody may scan the atmosphere with them. 
    • While NASA does not actively look for UAPs, any that are discovered will open up new scientific questions to investigate. 
    • Scientists from atmospheric science, aerospace, and other fields can all contribute to a better understanding of the phenomena. 


    Exploring the unknown in space is fundamental to our identity.


    Courtesy: NASA.gov


    ~ Jai Krishna Ponnappan

    You may also want to read more about Space Missions and Systems here.


    How Does NASA's Perseverance Rover Take Selfies On Mars?



      The historic photo of the rover next to the Mars Helicopter turned out to be one of the most difficult rover selfies ever shot. 









      Have you ever wondered how rovers on Mars snap selfies? 


      NASA's Perseverance rover captured the historic April 6, 2021, selfie of itself alongside the Ingenuity Mars Helicopter, and color video shows how it was done. 

      As an added bonus, the rover's entry, descent, and landing microphone recorded the sound of the arm's motors whirring. 


      Engineers use selfies to evaluate the rover's wear and tear. The images also inspire a new generation of space aficionados: 


      • Many members of the rover crew may recall a favorite picture that first piqued their interest in NASA. 
      • Vandi Verma, Perseverance's lead engineer for robotic operations at NASA's Jet Propulsion Laboratory in Southern California, stated, "I got into this when I saw a photo from Sojourner, NASA's first Mars rover." 
      • Verma served as a driver for the agency's Opportunity and Curiosity rovers, and she was involved in the first selfie taken by Curiosity on Oct. 31, 2012. 
      • “We had no idea when we snapped that first selfie that these would become so iconic and routine,” she added. 
      • The rover's robotic arm twists and maneuvers to capture the 62 pictures that make up the image, as shown on video from one of Perseverance's navigation cameras. 
      • What it doesn't show is how much effort went into creating the first selfie. Let's take a deeper look. 






      Teamwork. 


      Perseverance's selfie was made possible by a core group of approximately a dozen individuals, including rover drivers, JPL engineers who conducted tests, and camera operations engineers who created the camera sequence, analyzed the pictures, and stitched them together. 


      It took approximately a week to plan out all of the necessary individual instructions. 

      • Everyone was working on “Mars time,” which meant being up in the middle of the night and catching up on sleep throughout the day (a day on Mars is 37 minutes longer than on Earth). 
      • These members of the crew would occasionally forego sleep in order to complete the selfie. JPL collaborated with Malin Space Science Systems (MSSS) in San Diego, which designed and operated the selfie camera. 




      The camera, dubbed WATSON (Wide Angle Topographic Sensor for Operations and eNgineering), is intended for close-up detail pictures of rock textures rather than wide-angle images. 


      • Because each WATSON image captures only a tiny part of a scene, engineers had to command the rover to take dozens of separate pictures to create the selfie. 
      • Mike Ravine, MSSS's Advanced Projects Manager, stated, "The thing that required the greatest care was putting Ingenuity into the proper position in the selfie." 

      “Considering how tiny it is, I think we did fairly well.” The MSSS image processing experts got to work as soon as the pictures from Mars arrived. 


      • They begin by removing any imperfections produced by dust that has collected on the camera's light sensors. 
      • They then use software to combine the individual frames into a mosaic and smooth out the seams. 
      • Finally, an engineer warps and crops the mosaic so it looks more like the kind of camera picture the public is familiar with (a simplified sketch of the stitching step follows). 






      Simulations on a computer. 



      Perseverance, like the Curiosity rover (seen taking a selfie in this black-and-white video from March 2020), has a spinning turret at the end of its robotic arm. 


      • The WATSON camera, which remains focused on the rover during selfies while being tilted to record a portion of the landscape, is housed in the turret among other scientific equipment. 
      • The arm serves as a selfie stick in the final result, staying just out of frame. 
      • Getting Perseverance to video its selfie stick in action was considerably more difficult than it was for Curiosity. 
      • Perseverance's turret is 30 inches (75 centimeters) wide, compared to Curiosity's 22 inches (55 centimeters). 
      • That's the equivalent of waving a road bike wheel a few millimeters in front of Perseverance's mast, the rover's "head." 
      • JPL developed software to prevent the arm from colliding with the rover. 
      • The engineering team changes the arm trajectory every time a collision is detected in simulations on Earth; the procedure is repeated hundreds of times to ensure the arm motion is safe (a toy sketch of this loop follows this list). 
      • The last instruction sequence brings the robotic arm as near to the rover's body as possible without touching it. 
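
      Here is a toy sketch of that check-and-adjust loop, under stated assumptions: the real JPL planners work with full 3-D arm kinematics, whereas this reduces the problem to 2-D waypoints and a circular keep-out zone around the rover body.

        # Toy illustration of iterative collision checking -- not JPL software.
        import math

        KEEP_OUT_RADIUS = 1.0   # rover body modeled as a circle (assumed)
        SAFETY_MARGIN = 0.05    # required clearance in metres (assumed)

        def collides(path):
            """True if any waypoint enters the keep-out zone plus margin."""
            return any(math.hypot(x, y) < KEEP_OUT_RADIUS + SAFETY_MARGIN
                       for x, y in path)

        def nudge(path):
            """Push every waypoint slightly away from the rover body."""
            return [(x * 1.02, y * 1.02) for x, y in path]

        # Candidate arc that starts too close, then gets adjusted until safe.
        path = [(math.cos(a / 10), math.sin(a / 10)) for a in range(32)]
        adjustments = 0
        while collides(path):
            path = nudge(path)
            adjustments += 1
        print("safe trajectory found after", adjustments, "adjustments")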

      Other simulations are performed to verify, for example, that the Ingenuity helicopter is properly positioned in the final photo and that the microphone can capture sound from the robotic arm's motors. 





      Microphone Onboard




      Perseverance carries a microphone in its SuperCam instrument in addition to its entry, descent, and landing microphone. 


      • The microphones are a first for NASA's Mars missions, and audio will be a valuable new tool for rover engineers in the coming years. 
      • It may be used to give crucial information about whether something is functioning properly, among other things. 
      • Engineers used to have to make do with listening to a test rover on Earth. 


      “It's like your car: even if you're not a technician, you may hear an issue before you know there's a problem,” Verma said. 


      Though engineers haven't heard anything alarming so far, the humming motors sound strangely melodic echoing through the rover's chassis. 





      More Information about the Mission. 



      • Astrobiology, particularly the hunt for evidence of ancient microbial life, is a major goal for Perseverance's mission on Mars. 
      • The rover will study the planet's geology and climatic history, lay the path for human exploration of Mars, and be the first mission to gather and store Martian rock and regolith (broken rock and dust). 
      • Follow-up NASA missions, in collaboration with the European Space Agency (ESA), would send spacecraft to Mars to collect these sealed samples from the surface and return them to Earth for further study. 


      The Mars 2020 Perseverance mission is part of NASA's Moon to Mars exploration strategy, which includes Artemis lunar missions that help prepare for human exploration of Mars. 


      The Perseverance rover was constructed and is operated by JPL, which is administered for NASA by Caltech in Pasadena, California. 



      For additional information about Perseverance, go to: 

      mars.nasa.gov/mars2020/ 

      nasa.gov/perseverance


      Courtesy: NASA.gov


      ~ Jai Krishna Ponnappan

      You may also want to read more about Space Missions and Systems here.



      Quantum Computing Hype Cycle



        Context: Quantum computing has been classified as an emerging technology since 2005.





        Quantum computing has been on the up-slope of the Gartner Hype Cycle for more than 10 years, and it is arguably the most costly and hardest-to-comprehend emerging technology. 


        Quantum computing has been classified as an emerging technology since 2005, and it is still classified as such.

        The idea that theoretical computing techniques cannot be isolated from the physics that governs computing devices is at the heart of quantum computing. 





        Quantum physics, in particular, introduces a new paradigm for computer science that fundamentally changes our understanding of information processing and of what we previously believed to be the upper limits of computing. 



        If quantum mechanics governs nature, we should be able to simulate it using quantum computers (QCs). 





         

        Quantum Computing On The Hype Cycle.


        Since Gartner first placed quantum computing on its hype cycle, pundits have predicted that it will take over and permanently change the world. 

        Although it's safe to argue that quantum computers might mark the end of traditional cryptography, the truth will most likely be less dramatic. 

        This has obvious ramifications for technologies like blockchain, which are expected to power future financial systems. 

        While the Bitcoin system, for example, is expected to keep traditional mining computers busy until 2140, a quantum computer could potentially mine every token almost instantly using brute-force decoding. 



        More powerful digital ledger technologies based on quantum cryptography might level the playing field. 




        All of this assumes that quantum computing will become widely accessible and inexpensive. As things stand, this seems feasible. 

        Serious computer companies such as IBM, Honeywell, Google, and Microsoft, as well as younger specialty startups, are all working on putting quantum computing in the cloud right now and welcoming participation from the entire computing community. 

        To assist novice users, introduction packs and development kits are provided. 

        These are significant steps forward that will very probably accelerate progress as users develop more diversified and demanding workloads and find out how to handle them with quantum technology. 

        Also significant is the predicted democratizing impact of universal cloud access, which should bring individuals from a wider diversity of backgrounds into contact with quantum computing to understand, use, and influence its continued development. 




        Despite the fact that it has arrived, quantum computing is still in its infancy. 


        • Commercial cloud services might enable inexpensive access in the future, much as scientific and banking institutions today rent cloud AI applications for complicated tasks and are billed by the compute cycles used. 
        • Hospitals, for example, use genome sequencing applications hosted on AI accelerators in hyperscale data centers to diagnose genetic problems in newborns. The procedure is inexpensive, and the findings are available in minutes, allowing physicians to intervene quickly and possibly save lives. 
        • Quantum computing as a service has the potential to improve healthcare and a variety of other sectors, including materials science. 
        • Simulating a coffee molecule, for example, is very challenging with a traditional computer, requiring more than 100 years of processing time. The work can be completed in seconds by a quantum computer. 
        • Climate analysis, transit planning, biology, financial services, encryption, and codebreaking are some of the other areas that might benefit. 
        • For all its potential, quantum computing isn't coming to replace traditional computing or turn the world on its head. 
        • Quantum bits (qubits) may hold exponentially more information than traditional binary bits, since they can be in both states, 0 and 1, whereas binary bits can be in only one state at a time. 
        • Quantum computers, on the other hand, are suitable only for specific kinds of algorithms, since a qubit's state when measured is determined by chance. Other algorithms are best handled by traditional computers. 





        Quantum computing will take more than a decade to reach the Plateau of Productivity.




        Because of the massive efficiency it delivers at scale, quantum computing has caught the attention of technological leaders. 

        However, it will take years to mature for most applications, even as it makes limited progress in highly specialized sectors like materials science and cryptography in the near future. 


        Quantum approaches, meanwhile, are gaining traction in specific AI tools, as seen in recent advances in natural language processing that could break open the "black box" of today's neural networks. 




        • The toolkit, known as lambeq, is a classical Python library available on GitHub. 
        • Its release coincides with the arrival at Cambridge Quantum of well-known AI and NLP researchers, and it provides an opportunity for hands-on experience with quantum natural language processing (QNLP). 
        • The lambeq toolkit is designed to turn sentences into quantum circuits, offering a fresh perspective on text mining, language translation, and bioinformatics corpora. It is named after the late semantics scholar Joachim Lambek (a minimal usage sketch follows this list). 
        • According to Bob Coecke, chief scientist at Cambridge Quantum, QNLP may offer explainability that is not feasible in today's "bag of words" neural techniques run on conventional computers. 
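
        The snippet below is a minimal QNLP sketch based on lambeq's published examples (the class names reflect the documented API at the time of writing and may differ between versions): a sentence is parsed into a compositional diagram, then mapped onto a parameterised quantum circuit.

          # Minimal lambeq sketch, following the library's published examples.
          from lambeq import AtomicType, BobcatParser, IQPAnsatz

          # Parse a sentence into a compositional string diagram.
          parser = BobcatParser()
          diagram = parser.sentence2diagram("Alice runs")

          # Map the diagram to a quantum circuit: one qubit per noun wire
          # and per sentence wire, with a single IQP layer.
          ansatz = IQPAnsatz({AtomicType.NOUN: 1, AtomicType.SENTENCE: 1},
                             n_layers=1)
          circuit = ansatz(diagram)
          circuit.draw()  # render the resulting circuit for inspection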





        These diagrams resemble the parsed sentences once drawn on elementary school blackboards. 

        Coecke said that current NLP approaches "don't have the capacity to assemble things together to discover a meaning." 


        "What we want to do is introduce compositionality in the traditional sense, which means using the same compositional framework. We want to reintroduce logic." 

        Honeywell announced earlier this year that it would merge its own quantum computing operations with Cambridge Quantum to form an independent company to pursue cybersecurity, drug discovery, optimization, material science, and other applications, including AI, as part of its efforts to expand quantum infrastructure. 

        Honeywell claimed the new operation will cost between $270 million and $300 million to build. 


        Cambridge Quantum said that it will stay autonomous while collaborating with a variety of quantum computing companies, including IBM. 

        In an e-mail conversation, Cambridge Quantum founder and CEO Ilyas Khan said that the lambeq work is part of a larger AI project that is the company's longest-term initiative. 

        "In terms of timetables, we may be pleasantly surprised, but we feel that NLP is at the core of AI in general, and thus something that will truly come to the fore as quantum computers scale," he added. 

        In Cambridge Quantum's opinion, the most advanced application areas are cybersecurity and quantum chemistry. 





        What type of quantum hardware timetable do we expect in the future? 




        • There is well-informed agreement not only on the hardware roadmap but also on the software roadmap (Honeywell and IBM are among the credible corporate players in this regard). 
        • Quantum computing is not a general-purpose technology; we cannot use it to solve all of our existing business challenges.
        • According to Gartner's Hype Cycle for Computing Infrastructure for 2021, quantum computing will take more than ten years to reach the Plateau of Productivity. 
        • That is the point at which the analyst firm expects IT users to get the most out of a given technology. 
        • Quantum computing's current position on Gartner's Peak of Inflated Expectations (a categorization for emerging technologies that are deemed overhyped) is the same as it was in 2020.


        ~ Jai Krishna Ponnappan

        You may also want to read more about Quantum Computing here.



        Quantum Computing - What Exactly Is A Qubit?


        While the idea of a qubit has previously been discussed, it is critical to remember that it is the basic technology of any quantum computing paradigm, whether adiabatic or universal. 


        • A qubit is a physical device that acts as a quantum computer's most basic memory block. 
        • They are quantum versions of the classical bits (transistors) used in today's computers and smartphones. 
        • Both bits and qubits have the same objective in mind: to physically record the data that each computer is processing. 
        • As the information is altered throughout a computation, the bit or qubit must be modified to reflect that change. 
        • This is the only way the computer will be able to keep track of what is going on.
        • Because quantum computers store information in quantum states (superpositions and entanglement states), qubits must be able to physically represent these quantum states. 
        • This is difficult, since quantum events occur only in the most extreme circumstances. 



        To make matters worse, quantum phenomena arise naturally whenever the conditions are right. 

        Such events may be triggered by anything from a beam of light to a change in pressure or temperature, which can excite the qubit into a different quantum state than planned, distorting the information the qubit was supposed to contain. 


        • To address these issues, scientists place quantum computers in extremely controlled environments: temperatures no higher than 0.02 kelvin (colder than the roughly 2.7-kelvin background of outer space), a near-perfect vacuum (100 trillion times lower than atmospheric pressure), and, depending on the circumstances, either extremely weak or extremely strong magnetic fields. 
        • All of this effort is aimed at allowing a qubit to remain primarily in superposition states. 
        • This phenomenon is the core of quantum computing: it allows qubits to store not just 0 or 1 but a superposition of 0 and 1. 
        • These memory blocks can store considerably more information than their binary counterparts because each qubit may occupy many states, potentially infinitely many, whereas a classical bit occupies only one of two (see the sketch after this list). 
        • As a result, quantum computers can do computations considerably more quickly.
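
        A minimal sketch of the idea, assuming nothing beyond textbook quantum mechanics: a qubit can be represented as a normalised two-component state vector, and measuring it yields 0 or 1 with probabilities given by the squared amplitudes (the Born rule).

          # Toy single-qubit simulation with NumPy -- purely illustrative.
          import numpy as np

          # |psi> = alpha|0> + beta|1>; here an equal superposition.
          alpha = beta = 1 / np.sqrt(2)
          psi = np.array([alpha, beta])
          assert np.isclose(np.linalg.norm(psi), 1.0)  # states are unit vectors

          # Measurement returns 0 or 1 at random, weighted by |amplitude|^2.
          probs = np.abs(psi) ** 2
          print("ten measurements:", np.random.choice([0, 1], size=10, p=probs))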


        ~ Jai Krishna Ponnappan

        You may also want to read more about Quantum Computing here.



        A Brief History Of Quantum Computers



        Long before they were miniaturized into the MacBooks and PCs that now abound in commercial use, computers were a tangled mass of wiring, tubes, and metal that weighed tons and took up entire rooms. 


        • They started off as task-specific calculators. 
        • These computers, known as analog computers, range from simple abacuses to more advanced systems that resemble contemporary computers. 
        • They could compute gunfire range, trajectory, and deflection data, as well as automate temperature and pressure flow in factories and aircraft, for example. 

        The basic difference between an analog computer and a digital computer is how information is processed. 

        • Analog computers represent information by simulating the problem they are meant to solve with a physical model. 
        • Analog computers are restricted to single tasks because the problem is built into the machine's architecture. 
        • Digital computers, on the other hand, use symbols to represent quantities and information. 
        • Because symbolic representation is adaptable, a digital machine can be reconfigured for new problems again and again. 
        • It's the difference between an abacus, which uses beads and slides to represent numbers, and a smartphone calculator, which crunches numbers as binary values transmitted via a processor chip. 
        • It's the difference between a music record (on which sounds are etched) and a smart phone's music application (where data is encoded as binary values). 
        • It's worth noting that digital computers didn't always outperform their analog predecessors. 
        • Indeed, digital computers are the industry norm today, and they are built from the ground up to have much greater potential than analog computers. 
        • However, before that potential was fully realized, analog computers were considered a competitive option to digital computers in many areas, particularly industrial process control. 


        Both technologies were continuously improving, and until digital computers progressed far enough to surpass analog, the technological frontier was built on digital–analog hybrid systems like those used in NASA's Apollo and Space Shuttle projects. 

        The digital revolution did not begin until the 1980s, with the development and subsequent mass manufacturing of the silicon transistor and microprocessor. 

        It took 25 years to get from pure analog to 100% digital. 


        The quest to build the first functioning quantum computer today follows a similar evolutionary path. Adiabatic QCs (or AQC) are the analog counterpart of QCs, with research and development led by a Canadian firm, D-Wave Systems, and the US Intelligence Advanced Research Projects Activity (IARPA). 

        On the opposite end of the spectrum are computers that, like today's digital computers, use logic gates on individual qubits to perform computations. These are called universal QCs (or UQCs).


        ~ Jai Krishna Ponnappan

        You may also want to read more about Quantum Computing here.



        Quantum Computer Physics



        Computers, no matter how sophisticated they have gotten over the last century, still rely on binary choices of 0 and 1 to make sense of the chaos around us. 


        However, as our knowledge of the world grows, we become increasingly aware of the limits of this paradigm. 



        Quantum mechanics advancements continue to remind us of our universe's unfathomable complexity. The ideas of superposition and entanglement are at the heart of this rapidly growing area of physics. 

        • Simply stated, this is the notion that subatomic particles such as electrons may exist in many locations at the same time (superposition) and can seem to interact across apparently empty space (entanglement). 
        • These phenomena offer a one-of-a-kind physical mechanism for analyzing and storing data at rates that are orders of magnitude quicker than traditional computers. 
        • QCs, which were originally proposed in 1980, are now widely regarded as the technology to achieve this goal. 
        • The concept behind quantum computer bits (or qubits) is that they may store information not just as 0s or 1s, but also as a superposition of both 0 and 1 – theoretically endless permutations of numbers between 0 and 1. 
        • As a result, each qubit is endowed with enormous quantities of data. If computers today can do so much with only two states, imagine the potential of a machine that can access millions of superpositions between 0 and 1 (see the sketch after this list). 
        • QCs will be able to compute information much faster, shattering our present data processing limitations. 
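
        A minimal sketch of entanglement, again using only textbook quantum mechanics: the two-qubit Bell state (|00> + |11>)/sqrt(2) yields measurement outcomes that are individually random yet perfectly correlated.

          # Toy Bell-state simulation with NumPy -- purely illustrative.
          import numpy as np

          # Amplitudes for the basis states |00>, |01>, |10>, |11>.
          bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
          probs = np.abs(bell) ** 2   # Born rule gives [0.5, 0, 0, 0.5]

          rng = np.random.default_rng()
          for _ in range(5):
              # Only "00" or "11" ever appears: each qubit alone looks like
              # a coin flip, but the pair always agrees.
              print(rng.choice(["00", "01", "10", "11"], p=probs))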


        They're the means of bringing artificial intelligence, risk analysis, optimization, and a slew of other long-envisioned technologies to fruition. 

        For many new tasks, they are the logical successor to the contemporary computer that has defined the information era. 

        This has ramifications for neurodegenerative disease, energy, agriculture, economics, biochemistry, and a variety of other fields of research. 


        ~ Jai Krishna Ponnappan

        You may also want to read more about Quantum Computing here.



        Quantum Computing - Transition from Classical to Quantum Computers (QCs)


        Since the invention of the computer in the 1930s, we have been able to build economic, social, and technical models for many areas of life.

        These machines use the binary system. This means that data is represented as a string of 0s and 1s, with each digit being an unambiguous binary choice of 0 or 1. 



        Computers need a matching physical mechanism to represent this data. Think of the system as a set of switches, with one position indicating a 1 and the other a 0. On today's microprocessors, there are billions of these switches. 



        Information is stored in the form of strings of 0s and 1s, which are then processed, evaluated, and computed using logic gates. 


        These are transistors that have been linked together. Logic gates are the basic building blocks of the massive calculations we ask modern computers to perform, and they may be linked together hundreds of millions of times to execute sophisticated algorithms.
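
        As a small illustration of that composition principle (a sketch, not how gates are physically wired), the half adder below links two elementary gates to add a pair of single bits, the first step toward the multi-bit adders inside real processors.

          # Composing logic gates into arithmetic -- illustrative sketch.
          def AND(a, b):
              return a & b

          def XOR(a, b):
              return a ^ b

          def half_adder(a, b):
              """Add two single bits; returns (sum_bit, carry_bit)."""
              return XOR(a, b), AND(a, b)

          for a in (0, 1):
              for b in (0, 1):
                  s, c = half_adder(a, b)
                  print(f"{a} + {b} -> carry {c}, sum {s}")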


        ~ Jai Krishna Ponnappan

        You may also want to read more about Quantum Computing here.



        How Can We Live in Peace With AI?


        The limbic cortex is a region of the brain that neuroanatomists believe is the seat of emotion, addiction, mood, and a variety of other mental and emotional processes. 


        The limbic system's amygdala, which is responsible for basic survival impulses like fear and aggression, is also known as "The Lizard Brain" or "The Reptilian Brain." 



        • This is because a lizard's limbic system is its only source of brain function. 
        • The lizard brain is why you're scared, why you don't make all the art you can, why you don't ship when you can. 
        • The lizard brain is the root of the resistance. 
        • The lizard brain is famished, terrified, enraged, and horny. 
        • The lizard brain is primarily concerned with eating and staying safe. 

        Because status in the group is necessary for survival, the lizard brain is concerned with what others think. 


        The greatest line in Barbara Tuchman's 1961 book The Guns of August sums up the inability to plan for a lengthy World War I, among other mistakes: 


        "The inclination of everyone on both sides was not to prepare for the three tougher alternatives, not to act upon what they knew to be true." She's also discovered "Tuchman's rule," a historical occurrence that has been recognized as a psychological principle of "perceptual readiness" or "subjective probability": 

        Disasters are seldom as pervasive as they seem from written accounts. A disaster seems continuous and ubiquitous because it is in the record, but it was more likely sporadic in both time and place. Furthermore, as we know from our own experience, the persistence of the normal is typically greater than the impact of the disruption. 

        After digesting today's news, one expects to be confronted with a society dominated by strikes, crimes, power outages, broken water mains, delayed trains, school closings, muggers, drug addicts, neo-Nazis, and rapists. 

        On a fortunate day, one may get home in the evening without having seen more than one or two of these occurrences. As a result, I developed Tuchman's Law, which states that "the fact of being reported increases the seeming magnitude of any terrible development by five to tenfold". 

        ~ Barbara W. Tuchman 


        In other words, people prefer to read about spectacular and overblown occurrences, so events are portrayed as more widespread than they really were. 


        • In history and the news, the negative elements of events are often highlighted, while chroniclers frequently overlook the good sides of significant occurrences.
        • Startup failures, for example, are widely publicized, while the achievements of the influential few are seldom covered, or are recorded only in fine print. 

        Many other cognitive bias situations, such as groupthink, fear of authority, lack of creativity, and hyper-rationality, have also been examined by psychologists. 



        • William Whyte coined the word "groupthink," and Irving Janis subsequently created the idea of groupthink to explain poor decision-making that may occur in groups as a consequence of factors that bring them together. 
        • The extreme fear of authority is classified as a type of social phobia by mental health professionals. 
        • The biggest adversary of truth, according to Albert Einstein, is blind obedience to authority. 
        • When something appears apparent to those in the know, foreseeable (especially in retrospect), and yet no preparation is made for the unfavorable result, it is called a failure of imagination. 
        • There is a lack of imagination if the person lacks the capacity or refuses to pull elements from past experiences and put them together to create an imagined scenario. 
        • A lack of constructive episodic simulation has been linked to old age and to reliance on other kinds of memory recall. 


        Hyper-rationality is a defensive mechanism against anything that is dangerous or unsettling. It depicts circumstances in which reason has been pushed beyond its logical boundaries. 

        Artificial intelligence (AI), on the other hand, offers a distinct perspective on independence: rational, transparent, ever-changing, relentless, and dispassionate. 


        • When applied to real-world circumstances, a range of methods using properly designed self-learning and AI technologies reduce or overcome cognitive biases. 
        • Depending on human-driven ethical norms, technology may bring both benefit and harm. 
        • If controlled by ethical norms and laws, AI stands a high possibility of becoming a basis for technology that overcomes human weakness. 
        • Biases of various kinds will have varying consequences, but they will always be detrimental. 
        • Fear of judgement, fear of failure, fear of the unknown, and fear of the irrational are the results of the four biases. 
        • This leads to people leaving, hiding, delaying, and freezing, none of which are desirable results for businesses or individuals. 


        This seems to be the most frequent observation of contemporary management, particularly with the focus on conflict of interest and fiduciary responsibilities. 


        Without any technical knowledge on the board or in management, it is virtually a given conclusion that the "correct" thing to do is to do nothing. However, AI has a lot to offer.


        ~ Jai Krishna Ponnappan


        You may also want to read more about Artificial Intelligence here.




        Digital To Quantum Computers At A Breakneck Speed



        Every year, the quantity of data created throughout the globe doubles. As data collection and transport move beyond stationary computers, as many gigabytes, terabytes, petabytes, and exabytes were created, processed, and gathered in 2018 as in all of human history before 2018. 

        Smart Phones, Smart Homes, Smart Clothes, Smart Factories, Smart Cities... the Internet is connecting numerous "smart" objects. And they're generating a growing amount of their own data. 

        • As a result, the demand for computer chip performance is increasing at an exponential rate. 
        • In fact, during the previous 50 years, their computational capacity has roughly doubled every 18 months (a back-of-the-envelope check follows this list). 
        • The number of components per unit space on integrated circuits grows in accordance with a law proposed in 1965 by Gordon Moore, Intel's future co-founder. 
        • The overall volume of data is growing faster than individual computer performance because the number of data-producing devices is itself growing at a similar rate.
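
        A back-of-the-envelope check of what that doubling claim compounds to, assuming one performance doubling every 18 months:

          # Compound growth implied by Moore's law -- rough illustration.
          years = 50
          doublings = years * 12 / 18        # one doubling per 18 months
          print(f"{doublings:.1f} doublings -> ~{2 ** doublings:.1e}x performance")

        That works out to roughly a ten-billion-fold increase over 50 years, which is why the prospect of this scaling ending is taken so seriously.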


        Concerns that "Moore's Law" will lose its validity at some time date back 25 years. The reason for this is because component miniaturization is causing issues: 


        • As electrons move through progressively smaller and more numerous circuits, the chips get hotter. But there's a bigger issue: electronic structures have shrunk to fewer than 10 nanometers in size, which is around 40 atoms across. 
        • The principles of quantum physics rule in transistors this small, rendering electron behavior completely unpredictable. Moore himself forecast the end of his law in 2007, giving it another 10 to 15 years. 
        • Indeed, for the first time ever, the semiconductor industry's 2016 roadmap for chip development did not follow Moore's law. 
        • However, thanks to nano-engineers' ingenuity, it is conceivable that even smaller and quicker electronic structures will be achievable in the future, delaying the end of “classical” shrinking for a few more years. But then what? 

        How long can we depend on the ability to simply increase the performance of computer chips? 

        The fact that Moore's Law will no longer be true does not indicate that we have reached the end of the road in terms of improving information processing efficiency. 


        However, there is a way to make computers that are significantly faster, even billions of times more powerful: quantum computers. 

        • These computers operate in a very different manner than traditional computers. 
        • Rather than ignoring electrons' quantum properties and the challenges associated with ever-greater component miniaturization, a quantum computer explicitly exploits these properties in how it processes data. 
        • We might tackle issues that are much too complicated for today's "supercomputers" in physics, biology, weather research, and other fields with the aid of such devices. 
        • The development of quantum computers might spark a technological revolution that will dominate the twenty-first century in the same way that digital circuits dominated the twentieth. 
        • Quantum computers are expected to offer computation speeds that are unimaginable today.

        ~ Jai Krishna Ponnappan

        You may also want to read more about Quantum Computing here.


        Quantum Computing And Digital Evolution



        Today's computer is based on a concept from the 1940s. Although the shrinking of computer chips has forced computer developers to study quantum mechanical rules, today's computers still operate purely on classical physics principles. 



        • Tubes and capacitors were used in the earliest computers in the 1940s, and the transistor, which was initially a "classical" component, is still a vital component in any computer today. 
        • The term "transistor" stands for "transfer resistor," which simply indicates that an electrical resistance is controlled by a voltage or current. 
        • The first transistor patent was submitted in 1925. Shortly after, in the 1930s, it was discovered that basic arithmetical operations may be performed by carefully controlling the electric current (for example, in diodes). 
        • Slow computation speed and high energy consumption are the two primary reasons why point-contact transistors, triodes, and diodes based on electron tubes are found only in technology museums today. 
        • Although the components have evolved, the architecture developed by Hungarian mathematician and scientist John von Neumann in 1945 remains the foundation for today's computers. 
        • The memory card, which carries both program instructions and (temporarily) the data to be processed, is at the heart of von Neumann's computer reference model. 
        • A control unit manages the data processing sequentially, that is, step by step, in single binary computing steps. Computer scientists call this a "SISD architecture" (Single Instruction, Single Data). 

        Although electron tubes and early transistors have been replaced with smaller, faster field-effect transistors on semiconductor chips, the architecture of today's computers has remained the same since its inception. 


        How does sequential information processing in computers work? 


        Alan Turing, a British mathematician, theoretically outlined the fundamental data units and their processing in 1936. 

        The binary digital units, or "bits," are the most basic information units in the system. Because a bit may assume either the state "1" or the state "0," similar to a light switch that may be turned on or off, binary implies "two-valued." 

        • The word "digital" comes from the Latin digitus, which means "finger," and refers to a time when people counted with their fingers. 
        • Today, "digital" refers to information that may be represented by numbers. 
        • In today's computers, electronic data processing entails turning incoming data in the form of many consecutively organized bits into an output that is also in the form of many consecutively ordered bits. 
        • Blocks of individual bits are processed one after the other, much like chocolate bars on an assembly line; for a letter, for example, a block of eight bits, referred to as a "byte," is needed. 
        • There are just two processing options for a single bit: a 0 (or 1) stays a 0 (or 1), or a 0 (or 1) turns into a 1 (or 0). 
        • The fundamental electrical components of digital computers, known as logic gates, are always the same basic electronic circuits, embodied by physical components such as transistors, through which information is transferred as electric impulses. 
        • The connection of many similar gates allows for more sophisticated processes, such as the addition of two integers. 

        Every computer today is a Turing machine: it does nothing but process information encoded in zeros and ones in a sequential manner, changing it into an output encoded in zeros and ones as well. 
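
        To make the sequential model concrete, here is a toy machine in the spirit of Turing's 1936 construction (a simplified sketch, not his original formulation): it walks along a binary tape one cell at a time and increments the number written there.

          # Toy tape machine: sequential, one-bit-at-a-time processing.
          def increment(tape):
              """tape: list of 0s and 1s, least-significant bit last."""
              head = len(tape) - 1          # start at the rightmost cell
              while head >= 0 and tape[head] == 1:
                  tape[head] = 0            # carry propagates one step at a time
                  head -= 1
              if head >= 0:
                  tape[head] = 1
              else:
                  tape.insert(0, 1)         # carry ran off the tape: extend it
              return tape

          print(increment([1, 0, 1, 1]))    # 1011 (11) -> [1, 1, 0, 0] (12)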


        • However, this ease of data processing comes at a cost: to manage the quantity of data necessary in today's complicated computer systems, a large number of zeros and ones must be handled. 
        • The number of available computational blocks improves the processing capacity of a computer linearly: a chip with twice as many circuits can process data twice as quickly. 
        • The speed of today's computer chips is measured in gigahertz, or billions of operations per second. This necessitates the use of billions of transistors. 
        • The circuitry must be tiny to fit this many transistors on chips the size of a thumbnail. Only then can such fast-switching systems' total size and energy consumption be kept under control. 
        • The move from the electron tube to semiconductor-based bipolar or field effect transistors, which were created in 1947, was critical for the shrinking of fundamental computing units on integrated circuits in microchips. 
        • Doped semiconductor layers are used to construct these nanoscale transistors. 


        This is where quantum mechanics enters the picture. 

        • To understand and control what's going on, we need a quantum mechanical model of how electrons move within these semiconductors. 
        • This is the so-called "band model" of electronic energy levels in metallic conductors. 

        Understanding quantum physics was not required for the digital revolution of the twentieth century, but it was a prerequisite for the extreme miniaturization of integrated circuits.


        ~ Jai Krishna Ponnappan

        You may also want to read more about Quantum Computing here.

