Juno, NASA's Spacecraft, Takes A Close Look At Jupiter's Moon Ganymede

 


From left to right: mosaic and geologic maps of Jupiter's moon Ganymede, assembled from the best available images from NASA's Voyager 1 and 2 spacecraft and NASA's Galileo spacecraft. 

Credit: USGS Astrogeology Science Center/Wheaton/NASA/JPL-Caltech 


The first of the gas-giant orbiter's back-to-back flybys will deliver the closest encounter with the giant moon in more than 20 years. 

NASA's Juno spacecraft will pass within 645 miles (1,038 kilometers) of Jupiter's biggest moon, Ganymede, on Monday, June 7, at 1:35 p.m. EDT (10:35 a.m. PDT). The flyby will be the closest any spacecraft has come to the solar system's largest natural satellite since NASA's Galileo spacecraft made its last close approach on May 20, 2000. 


The solar-powered spacecraft's flyby will yield insights into the moon's composition, ionosphere, magnetosphere, and ice shell, in addition to striking images. Juno's measurements of the radiation environment near the moon will also benefit future missions to the Jovian system. 

Ganymede, which is larger than the planet Mercury, is the only moon in the solar system with its own magnetosphere, a bubble-shaped region of charged particles surrounding the body. “Juno contains a suite of sensitive equipment capable of observing Ganymede in ways never previously possible,” said Juno Principal Investigator Scott Bolton of the Southwest Research Institute in San Antonio. 

“By flying so close, we will bring Ganymede exploration into the twenty-first century, complementing future missions with our unique sensors and assisting in the preparation of the next generation of missions to the Jovian system, including NASA's Europa Clipper and ESA's JUpiter ICy moons Explorer [JUICE] mission.” 


About three hours before the spacecraft's closest approach, Juno's science instruments will begin gathering data. Along with the Ultraviolet Spectrograph (UVS) and Jovian Infrared Auroral Mapper (JIRAM) instruments, Juno's Microwave Radiometer (MWR) will peer into Ganymede's water-ice crust, gathering data on its composition and temperature. 




An animated rotating globe of Ganymede, with a geologic map superimposed on a global color mosaic. Credit: USGS Astrogeology Science Center/Wheaton/ASU/NASA/JPL-Caltech 


“The ice shell of Ganymede has some light and dark regions, implying that certain areas may be pure ice while others contain dirty ice,” Bolton explained. 


“MWR will conduct the first comprehensive study of how ice composition and structure change with depth, leading to a deeper understanding of how the ice shell originates and the mechanisms that resurface the ice over time.” 

The findings will complement those of ESA's upcoming JUICE mission, which will examine the ice using radar at several wavelengths when it becomes, in 2032, the first spacecraft to orbit a moon other than Earth's Moon. 


Juno's X-band and Ka-band radio frequencies will be used in a radio occultation experiment to probe the moon's tenuous ionosphere (the outer layer of an atmosphere where gases are excited by solar radiation to form ions, which carry an electrical charge). 

“As Juno passes behind Ganymede, radio signals will pass through Ganymede's ionosphere, generating small variations in frequency that should be picked up by two antennas at the Deep Space Network's Canberra complex in Australia,” said Dustin Buccino, a Juno mission signal analysis engineer at JPL. “If we can measure this change, we might be able to understand the relationship between Ganymede's ionosphere, its intrinsic magnetic field, and Jupiter's magnetosphere.” 


With NASA's interactive Eyes on the Solar System, you can see where Juno is right now. 

The Juno spacecraft is a dynamic engineering marvel: three huge solar-array blades reach out 66 feet (20 meters) from its cylindrical, six-sided body, and the spacecraft spins to keep itself stable as it executes oval-shaped orbits around Jupiter. 


Juno's Stellar Reference Unit (SRU) navigation camera is normally responsible for keeping the Jupiter spacecraft on track, but it will perform double duty during the flyby. 


Along with its navigational functions, the camera will collect information on the high-energy radiation environment in the region surrounding Ganymede by capturing a special set of images. 

The camera is well shielded against radiation that might otherwise harm it. 

“In Jupiter's harsh radiation environment, the signatures of penetrating high-energy particles appear in the images as dots, squiggles, and streaks, like static on a television screen,” said Heidi Becker, Juno's radiation monitoring lead at JPL. “We extract these radiation-induced noise patterns from SRU images to obtain diagnostic snapshots of the radiation levels encountered by Juno.” 


Meanwhile, the Advanced Stellar Compass camera, developed by the Technical University of Denmark, will count very energetic electrons that penetrate its shielding every quarter of a second. The JunoCam imager has also been enlisted. 


The camera was designed to convey the excitement and beauty of Jupiter exploration to the public, but it has also provided a wealth of valuable science during the mission's nearly five years at the planet. For the Ganymede flyby, JunoCam will capture images at a resolution comparable to the best from Voyager and Galileo. 

The Juno science team will examine the images and compare them to those from earlier missions, looking for changes in surface features that may have occurred over four decades or more. 

Any changes in the pattern of craters on the surface might aid astronomers in better understanding the present population of objects that collide with moons in the outer solar system. 


Because of the speed of the flyby, the icy moon will, from JunoCam's perspective, change from a point of light to a visible disk and back to a point of light in roughly 25 minutes. 


That is just enough time for five images. “Things move quickly in the realm of flybys, and we have two back-to-back flybys coming up next week. So every second counts,” said Juno Mission Manager Matt Johnson of the Jet Propulsion Laboratory. 

“On Monday, we're going to race past Ganymede at almost 12 miles per second (19 kilometers per second). Less than 24 hours later, we're making our 33rd science flyby of Jupiter, swooping low over the cloud tops at around 36 miles per second (58 kilometers per second). It's going to be a roller coaster.” 

The Juno mission is managed by JPL, a division of Caltech in Pasadena, California, for the principal investigator, Scott J. Bolton of the Southwest Research Institute in San Antonio. Juno is part of NASA's New Frontiers Program, which is administered for the agency's Science Mission Directorate in Washington by NASA's Marshall Space Flight Center in Huntsville, Alabama. 


The spacecraft was manufactured and is operated by Lockheed Martin Space in Denver. 


Courtesy: www.nasa.gov

Posted by Jai Krishna Ponnappan


More information about Juno is available at:


https://www.nasa.gov/juno

https://www.missionjuno.swri.edu


Follow the mission on social media at 

https://www.facebook.com/NASASolarSystem 

and on Twitter at https://twitter.com/NASASolarSystem 






Nanomachine Manufacture - A World Made of Dust - Nano Assemblers



Let us consider Feynman's ultimate vision: machines that can manufacture any substance from atomic components in the same way that children construct buildings out of Lego bricks. 

Like an atomic 3D printer, such "assemblers" could build whatever we want seemingly out of nothing: a handful of soil contains all the atoms they would need. 


  • The term "nano-3D" may become a new tech buzzword in the near future. These devices would not be completely new! They've been around for 1.5 billion years on our planet. 
  • Nanomachines manufacture proteins, cell walls, nerve fibers, muscle fibers, and even bone molecule by molecule in our body's two hundred distinct cell types using specific building blocks (sugar molecules, amino acids, lipids, trace elements, vitamins, and so on). 
  • Highly specialized proteins, the enzymes, play the key role here. The energy required for these activities comes from the food we eat. 
  • Like tiny assembly lines, biological nanomachines transport, build, and process everything we need to live, in countless metabolic processes. 
  • Nature's invention of cell metabolism in living systems demonstrated long ago that assemblers are possible. As nanomachines, enzymes are the true masters of efficiency. 

What is preventing us, as humans, from producing such technologies? 


We can even take it a step further: if nanomachines can accomplish anything, why couldn't they construct themselves? 


  • Nature has also demonstrated this on the nanoscale: DNA and RNA are nothing more than extremely efficient, self-replicating nanomachines. 
  • Man-made nanomachines may not be as far away from self-replication as they appear. 
  • Nature has long addressed the difficulty of nanomachine self-replication: DNA may be thought of as a self-replicating nanomachine. 
  • Nanotechnology opens up a world of possibilities for us to enhance our lives. Nonetheless, many people are put off by the word "nano," much as they are by "gene" and "atomic," terms that likewise refer to the incomprehensibly small. 
  • Nanoparticles, genes, and atoms are all invisible to the naked eye, yet the technologies that rely on them are increasingly influencing our daily lives. 


What happens, though, when artificial nanomachines have their own momentum and are able to proliferate inexorably and exponentially? What if nanomaterials turn out to be toxic? 


The second of these issues has already arisen: nanoparticles used in a variety of products, such as cosmetics, can accumulate in unexpected places, such as the human lung or in marine fish. 


What impact do they have in that area? 

Which compounds have chemical reactions with them and can attach to their extremely active surfaces? 


  • According to several studies, certain nanoparticles are hazardous to microorganisms. 
  • To properly analyze not just the potential, but also the impacts of nanotechnologies, more information and education are necessary. 

This is especially true of the quantum computer.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Nano: Infinite Possibilities On The Invisible Small Scale



Norio Taniguchi was the first to define the word "nanotechnology" in 1974: Nanotechnology is primarily concerned with the separation, consolidation, and deformation of materials by a single atom or molecule. 


  • The word "nano" refers to particles and material structures that are one nanometer to 100 nanometers in size (1 nm is one millionth of a millimeter). 
  • The DNA double helix has a diameter of 1.8 nm, while a soot particle is 100 nm in size, almost 2,000 times smaller than the full stop at the end of this sentence. 
  • The nanocosm's structures are therefore substantially smaller than visible light wavelengths (about 380–780 nm). 

The nano range is distinguished by three characteristics: 


It is the boundary between quantum physics, which applies to atoms and molecules, and classical rules, which apply to the macro scale. Scientists and engineers can harness quantum phenomena to develop materials with unique features in this intermediate realm. This includes the tunnel effect, which, as indicated in the first chapter, is significant in current transistors. 

When nanoparticles are coupled with other substances, they aggregate a huge number of additional particles around them, which is ideal for scratch-resistant car paints, for example. 

Because surface atoms are more easily pulled out of an atomic assembly, for instance where the material fractures, nanoparticles act as catalysts for chemical reactions. A simple geometric consideration shows why. A cube with a side of one nanometre (approximately four atoms) contains on average 64 atoms, 56 of which sit on the surface (87.5 percent). The bigger the particle, the fewer surface atoms are available for reactions compared with bulk atoms. Only 7.3 percent of the atoms in a nanocube with a side of 20 nm (containing 512,000 atoms) are on the surface, and their share declines to 1.2 percent at 100 nm.
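
As a quick check of this geometric argument, here is a minimal Python sketch. It assumes a simple cubic packing of roughly four atoms per nanometre of edge length, as the 1 nm example above implies; the exact percentages therefore depend on the packing model and differ slightly from the figures quoted above.

    def surface_fraction(atoms_per_edge: int) -> float:
        """Fraction of atoms lying on the surface of a simple cubic block."""
        total = atoms_per_edge ** 3
        interior = max(atoms_per_edge - 2, 0) ** 3  # atoms completely enclosed
        return (total - interior) / total

    # Assumption: about 4 atoms per nanometre of edge length.
    for edge_nm in (1, 20, 100):
        n = 4 * edge_nm
        print(f"{edge_nm:>3} nm cube: {n**3:>11,} atoms, {surface_fraction(n):.1%} on the surface")
    # 1 nm: 87.5%   20 nm: ~7.4%   100 nm: ~1.5%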


Nanoparticles thus consist almost entirely of surface, which makes them extremely reactive and gives them surprising mechanical, electrical, optical, and magnetic properties. 


Physicists have known for a long time that this is true in (quantum) theory. However, the technologies required to isolate and treat materials at the nanoscale have not always been available. 

  • The invention of the Scanning Tunneling Microscope (STM) by Gerd Binnig and Heinrich Rohrer in 1981 was a watershed moment in nanotechnology (for which they were awarded the 1986 Nobel Prize in Physics). Individual atoms can be imaged with this device. Thanks to a particular quantum phenomenon (the tunneling effect), the electric current between the microscope's tip and the electrically conductive sample reacts extremely sensitively to changes in their separation as small as one tenth of a nanometer. 
  • In 1990, Donald Eigler and Erhard Schweizer succeeded in moving individual atoms from point A to point B by altering the voltage applied to the STM tip; the device could now not only image but also move individual atoms. The two researchers “wrote” the IBM logo with 35 xenon atoms on a nickel crystal. Twenty-two years later, researchers were able to construct a one-bit memory cell using just 12 atoms (conventional one-bit memory cells still comprise hundreds of thousands of atoms). 

What Feynman envisioned as a vision of the future in 1959, namely the atom-by-atom production of exceedingly small goods, is now a reality. 

Physicists and engineers are using quantum physics to not only manipulate atoms and create microscopic components, but also to produce new materials (and better comprehend existing ones).


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.


Nanomaterials - Materials Of Wonder.



Skilled blacksmiths have been producing the renowned Damascus steel in a complex manufacturing process for some 2,000 years. Layers of various steels are stacked, forged together, and repeatedly folded over and flattened, much as a baker kneads dough, until a material consisting of up to several hundred of these layers is eventually created. 

Damascus steel is extremely hard while also being remarkably flexible compared with ordinary steel. It is now recognized that the incorporation of carbon nanotubes with lengths of up to 50 nm and diameters of 10 to 20 nm is responsible for these exceptional material properties. 


Of course, ancient and medieval blacksmiths had no knowledge of nanotubes because their procedures were totally dependent on trial and error. 


As further examples, humans were already producing gleaming metallic nanoparticle surfaces on ceramics 3,400 years ago in Mesopotamia and Egypt, while the Romans used nanoparticles to seal their everyday ceramics, and red stained glass windows were made with glass containing gold nanoparticles in the Middle Ages. 


  • Nanoparticle-based materials have been made and used since antiquity. We can now understand and even enhance materials like Damascus steel thanks to the insights of quantum physics.
  • Millennia-old forging methods can be further enhanced by carefully specifying the inclusion of particular materials. 
  • Nanometer-sized nickel, titanium, molybdenum, or manganese particles can be introduced into the iron crystal lattice of steel for this purpose. Nickel and manganese, in particular, encourage the development of nanocrystals, which maintain their structure even when the metal is bent, ensuring the material's resilience. 
  • Due to the precise dispersion of these nanocrystals, the steel becomes very flexible and bendable. Despite accounting for a relatively tiny portion of the overall mass, the extra particles provide far better characteristics than the pure iron crystal lattice. This strategy is employed, for example, in the automobile and aerospace industries, where more deformable and robust steels enable lightweight materials and energy-saving building processes.
  • The notion of introducing super-fine distributions of nanoparticles into materials (known as "doping" in semiconductors) underpins a variety of nanomaterial manufacturing processes. 


The “seasoning” of materials with single atoms or nano-atomic compounds can give them completely new properties, allowing us to make: 


• foils that conduct electricity, 

• semiconductors with precisely controlled characteristics (which have been the foundation of computer technology for decades), and 

• creams that filter UV components out of sunlight. 


Nanotechnology can also be used to replicate materials that have evolved naturally. 


 

Spider silk is a fine thread just a few thousandths of a millimetre thick, yet it is very ductile, heat-resistant up to 200 degrees, and five times stronger than steel. For decades, scientists have wished to create such a material in the lab. That dream has now become a reality. 


  • The secret of natural spider silk is a mixture of chain-shaped proteins and small carbohydrate fragments with lengths in the nanometer range. 
  • Artificial spider silk could be used to make super-textiles: blast-resistant gear for soldiers, super-elastic clothing for athletes, and breast-implant casings that prevent unpleasant scarring. 

Nanomaterials were created and exploited by evolution long before humanity did. We can reconstruct and even enhance these now thanks to quantum physics discoveries.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Nanomaterials - Diamonds Aren't The Only Thing That's Valuable.



Pure nanomaterials have become available today.


Graphite is a fascinating example. 


  • Graphite is a form of elemental carbon that is commonly used to make pencil leads. It is simply a stack of carbon layers, each the thickness of a single carbon atom. 
  • Each layer is made of graphene, a two-dimensional lattice of carbon atoms governed by quantum physics. 
  • For many years, scientists have been researching these ultra-thin carbon layers theoretically. 
  • Their quantum-physical calculations and simulations revealed that graphene must have incredible properties: 200 times the strength of steel, outstanding electrical and thermal conductivity, and transparency to visible light. 
  • They merely needed verification that their theoretical calculations were true in practice. 
  • Andre Geim and Konstantin Novoselov then succeeded in isolating pure graphene in 2004. Their trick was to peel it off a piece of graphite with ordinary adhesive tape. 
  • In 2010, Geim and Novoselov were awarded the Nobel Prize in Physics for their work. Has a Nobel Prize in Physics ever been granted for anything so simple? 


Graphene is the world's thinnest substance, with thicknesses on the order of one nanometer. 


  • At the same time, its atoms are all held together by densely packed “covalent” chemical bonds. 
  • In a sense, the material has no flaws, no places where it can break. 
  • Because each carbon atom in this lattice can participate in chemical processes on both sides, the material exhibits exceptional chemical, electrical, magnetic, optical, and biological properties. 


Graphene might be used in the following ways: 


• Clean drinking water production: graphene membranes may be utilized to construct extremely efficient desalination facilities. 

• Energy storage: Graphene may be utilized to store electrical energy more effectively and long-term than other materials, allowing for the creation of long-lasting and lightweight batteries. 

• Medicine: graphene-based prosthetic retinas are being studied by experts (see below). 

• Electronics: graphene can be used to make the world's smallest transistors. 

• Special materials: graphene might potentially be utilized as a coating to create flexible touchscreens, allowing mobile phones to be worn like bracelets. 


The EU believes graphene-based technologies have such promising futures that it designated research in this subject as one of two initiatives in the Future and Emerging Technologies Flagship Initiative, each with a one-billion-euro budget. 

The Human Brain Project is the other sponsored project, but a third has emerged in the meantime: the flagship project on quantum technologies. 

Graphene, a nanomaterial, is thought to be a future wonder material.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Microelectronics To Nanoelectronics

 


Doped silicon crystals are the basis of modern microelectronics. We've been pursuing the path from micro to nanoelectronics for quite some time now. 

And some of Feynman's vision has already come to fruition. In 1959, he claimed that a particle of dust could contain the information of 25 million books. 

  • To do this, one bit would have to be stored in 100 atoms. It is now feasible to create elementary storage units of just 12 atoms, so a particle of dust could hold over 200 million books (see the short calculation after this list). 
  • Carbon nanotubes, commonly known as nanotubes, are an example of future nanomaterials in electronics. 
  • Graphene layers have been rolled into tubes to create small carbon cylinders with a diameter of roughly 100 nanometers. 
  • Only the rules of quantum physics can explain their unique electrical characteristics. 
  • Because electrons pass through the nanotube almost without interference, i.e., without being deflected by obstructing atoms as they would be in a metallic conductor, nanotubes carry electrical currents better than any copper conductor, depending on the diameter of the tube. 
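
A back-of-the-envelope sketch of the storage scaling mentioned above; the only assumption is that capacity scales inversely with the number of atoms needed per bit:

    # Feynman (1959): a particle of dust could hold the information of 25 million books
    # if one bit were stored in 100 atoms.
    books_at_100_atoms_per_bit = 25_000_000
    atoms_per_bit_then = 100
    atoms_per_bit_now = 12   # elementary storage cell demonstrated in 2012 (see above)

    books_now = books_at_100_atoms_per_bit * atoms_per_bit_then / atoms_per_bit_now
    print(f"Roughly {books_now / 1e6:.0f} million books on the same particle of dust")
    # -> roughly 208 million books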


Stanford University researchers have built a functioning computer with 178 nanotube transistors. It has the processing capacity of a 1955 computer, which would have occupied an entire gymnasium. The nanomaterial "silicene" goes even further. 


  • Its atoms are arranged in two-dimensional honeycomb layers, similar to graphene. But unlike graphene, which is made of carbon, silicene is a foil of elemental silicon, a semiconductor, which makes it particularly attractive for computer chip fabrication. 
  • The first silicene transistor was built in 2014 by researchers at the University of Texas. 

Although silicene is still technically difficult to manufacture and process (it decays when exposed to oxygen, for example), there are high hopes that this material can dramatically improve the performance of computer chips. 


  • Transistors made of nanotubes or silicene could be switched significantly faster, resulting in significantly more powerful computer processors. 
  • The development of nanotubes for use in computers, however, is not the end of the story.


Physicists and computer designers want to employ single molecules as transistors in the future. Indeed, some organic molecules can be switched between electrically conductive and insulating states.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



When Biotechnology And Nanotechnology Collide



Richard Feynman foresaw sixty years ago that nanoparticles and nanomachines may be extremely useful in medicine. 


This aspect of his vision is also coming to fruition right now. Here are three examples of things that are already being done: 


Nano-Retina, an Israeli startup, has developed an artificial nano-retina that allows the blind to see again. It consists of a small, flat implant carrying a high-resolution array of nano-electrodes. Incoming light is collected by the electrodes, which stimulate the optic nerve and relay the signals to the brain as visual sensations. 

Nano-biosensors detect antibodies and particular enzymes in human body fluids in a "lab on a chip." Just one-thousandth of a millilitre of blood, urine, or saliva (or even less) is placed on a credit-card-sized chip. When the nanoparticles embedded in it come into contact with the target substance, they register characteristic chemical, optical, or mechanical changes. As a result, the chip can identify a variety of medical markers in only a few minutes. 

Nanoparticles deliver medications directly to sites of inflammation or mutated cells, allowing for a more effective pharmacological attack. Because blood is as sticky as honey for such small particles, the question of how to transport nanostructures through the bloodstream remained unanswered for a long time. They can now be steered by magnetic fields, for example. Bioengineers want to use them in precision chemotherapies against cancer cells, among other things. 


Nano-robots, often known as "nanobots," are extremely small robots that hold great promise in medicine. The health check-up we now visit the doctor for every two years could be replaced by a continuous nano-check. 


  • Nanobots would roam our bodies indefinitely, detecting viruses, gene changes, and harmful deposits in the circulation before they became a problem. 
  • They would then start treatment right away by administering medications straight to the illness location. 
  • They'd combat infections, reduce inflammation, remove cysts and cellular adhesions, unblock clogged arteries to avoid strokes, and even do surgery. 
  • They would submit the results immediately to the family doctor if required, who would then contact the patient to schedule an appointment. 
  • Doctors envision many small nano-robots, biomarkers, labs-on-a-chip, and other telemedical devices permanently circulating inside our bodies for health care and healing. Nanobots might also be employed in our food. 
  • They would help us digest food in such a way that nutrients are absorbed as efficiently as possible by our bodies. This would be beneficial in the treatment of disorders that currently require a strict diet. 


Researchers are also working on developing meals with nanoparticles on the surface that would mimic the flavor of chips, chocolates, or gummy bears while being nutritious and even healthful.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Ultra-Small Nano Machines - Masters Of The Nano-World



Our growing technical mastery of the nanoworld will open up a plethora of new technical possibilities, including Feynman's vision of ultra-small machines operating at the level of single atoms. 

  • Nanowheels, nanomotors, and even a nano-elevator have previously been constructed.
  • There is a nano-car with four distinct motors installed on a central support, powered by the tip of a scanning tunneling microscope. 

Nanotechnologists can make things even smaller. 


  • A single bent thioether molecule lying on a copper surface makes up the world's tiniest electric motor, which is only a nanometre in size. 
  • Two hydrocarbon chains of differing length (a butyl and a methyl group) hang like little arms from a central sulphur atom in this molecule. 
  • The whole molecule is attached to the copper surface in a way that allows it to spin freely. It is driven by a scanning tunneling microscope, whose electrons use the tunnel effect to excite the molecule's rotational degrees of freedom. 
  • The electrical current and the outside temperature can affect the motor's operating speed. The development of such nanomachines is still at an early stage. 
  • In terms of development, the molecular motor today is roughly where the electric motor was in the 1830s. In 1830, nobody could have predicted that the electric motor would one day power trains, dishwashers, and vacuum cleaners. 


In awarding the 2016 Nobel Prize in Chemistry, the Nobel Committee in Stockholm saw a comparable promise in molecular nanomachines. 

Molecular motors are anticipated to be employed in sensors, energy storage systems, and the production of novel materials in the near future. 


Nanotechnology has progressed in a number of ways that have mostly gone unnoticed by the general public: 


• The first generation of nanotechnology products, such as Damascus steel, were still passive materials with well-defined properties that did not change when used. 

• The second generation of nanotechnology products, on the other hand, produced tiny machines that “do work”, in other words, they drive an active process, such as a transport vehicle for targeted drug delivery in the body (see below). These nanostructures interact and react directly with other substances, changing themselves and/or their surroundings. 

• A third generation of nanotechnologies, known as "integrated nano-systems," is already on the horizon. Various active nano-components, such as copiers, sensors, motors, transistors, and so on, are used as building blocks and assembled into a working whole, much as an engine, clutch, electronics, and tires, when combined, become a car. This paves the way for more complex nanomachines to emerge.

 

Coupling nanostructures with varied characteristics and capabilities into sophisticated nanomachines is the next stage in nanotechnology.


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Nanotechnology's Possibilities: Technology On The Smallest Scales



We already employ nanotechnology in a variety of ways, but only a small fraction of the population is aware of it. Alongside the quantum computer, nanotechnology is the most interesting prospective technological application of quantum theory. 


Many of its uses are now part of our daily routines. Some examples include: 


• Sun cream lotions that use nanotechnology to give UV protection. 

• Nanotechnologically treated surfaces for self-cleaning window panes, scratch-resistant automobile paint, and ketchup that pours evenly from the bottle. 

• Textiles coated with nanoparticles to reduce perspiration odor. Antibacterial silver particles, for example, keep bacteria from turning our odorless perspiration into a foul-smelling body odor. 


The upcoming nanotechnologies are even more amazing. 

Nano-robots that automatically and permanently monitor the human body for disease, and autonomous nanomachines that can generate almost anything from a mound of soil. 

Nanotechnology has long been ingrained in our daily lives, but this technological outgrowth of quantum physics has a brighter future. 

One can get the notion that “nano” is the key to everything fascinating and futuristic. 


~ Jai Krishna Ponnappan


You May Also Want To Read More About Nano Technology here.



Potential of Quantum Computing Applications



Despite the threat that the existence of a large-scale quantum computer (a fault-tolerant quantum computer, or FTQC) poses to information security, the ability of noisy intermediate-scale quantum (NISQ) processors to provide unprecedented computing power in the near future opens up a wide opportunity space, especially for critical Defense Department applications and the Defense technology edge. 

The current availability of NISQ processors has drastically changed the development route for quantum applications. 

As a result, a heuristics-driven strategy has been developed, allowing for significantly greater engagement and industry involvement. 

Previously, quantum algorithm research was mostly focused on a far-off FTQC future, and determining the value of a quantum application needed extremely specialized mathematical abilities. 

We believe that, in the not-too-distant future, such specialized mathematical skills will no longer be essential for practical quantum advantage. 

As a result, it will be critical, particularly for the Defense Department and other agencies, to have access to NISQ devices, which we anticipate will enable the development of early mission-oriented applications. 

While NISQ processors do not pose a danger to communications security in and of themselves, this recently attained intermediate regime permits quantum hardware and software development to be merged under the ‘quantum advantage' regime for the first time, potentially speeding up progress. 


This emphasizes the security apparatus's requirement for a self-contained NISQ capability.




Quantum Computing Threat to Information Security



Current RSA public-key (asymmetric) encryption systems and related schemes rely on trapdoor mathematical functions, which make it simple to compute a public key from a private key but computationally infeasible to compute the converse, a private key from a public key.

Commonly used trapdoor functions exploit the difficulty of integer factorization and of elliptic-curve variants of the discrete logarithm problem, for neither of which is there a known method of computing the inverse in polynomial time (that is, on a practical timescale). 


In a nutshell, this so-called "computational hardness" provides the security. 
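
As a toy illustration of this asymmetry, here is a minimal Python sketch (not a real cryptosystem: the primes are absurdly small, and the brute-force trial-division loop stands in for the computationally hard inverse direction). Multiplying two secret primes is instant; recovering them from the product requires a search that grows rapidly with the size of the number.

    # Easy direction: multiply two secret primes to obtain a public modulus.
    p, q = 9973, 10007          # toy "private" primes
    n = p * q                   # toy "public" modulus, computed instantly

    # Hard direction: recover the primes from n by trial division.
    def factor(n: int) -> tuple[int, int]:
        d = 3
        while d * d <= n:
            if n % d == 0:
                return d, n // d
            d += 2
        return n, 1             # n itself is prime

    print(n, factor(n))         # feasible for toy numbers; infeasible at real key sizes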


In 1994, however, Peter Shor proposed a quantum method that may be employed on a sufficiently large-scale quantum computer to perform integer factorization in polynomial time. 

The now-famous quantum algorithm has since been shown to solve the discrete logarithm and elliptic-curve logarithm problems in polynomial time as well. 


The creation of an FTQC running this quantum algorithm would therefore jeopardize the security of present asymmetric public-key cryptography. 

Furthermore, Shor's method exemplifies how advances in the mathematics and physical sciences have the potential to jeopardize secure communications in general. 


In addition to Defense Department and critical cyber infrastructure systems, the world's digital revolution, which includes 4 billion internet users, 2 billion websites, and over $3 trillion in retail transactions, is backed at multiple tiers by existing public-key cryptography. 


While the creation of an FTQC is estimated to be at least a decade or two away, there is still a pressing need to solve this issue because of the ‘record now, exploit later' danger, in which encrypted data is collected and kept for subsequent decryption by an FTQC when one becomes available. 

As a result, the US National Institute of Standards and Technology's Post-Quantum Cryptography Project, which includes worldwide partners, is prioritizing the development of new "quantum hard" public-key algorithms, in effect a security "patch" for the internet.




Quantum Computing - A New Way to Compute

 


Google formally debuted their newly created Sycamore quantum processor in 2019 and claimed to have completed the first computation that was simple for a quantum computer but extremely challenging for even the most powerful supercomputers. 

Previously, continuous breakthroughs in transistor fabrication technology had propelled the world's ever-increasing computer capability. Computing power has increased dramatically during the last 50 years. 


Despite these significant technical advancements, the underlying mathematical laws that govern computers have remained basically constant. 

Google's demonstration of so-called "quantum supremacy," also known as "quantum advantage," was based on 30 years of advancements in mathematics, computer science, physics, and engineering, and it heralded the start of a new era that might cause considerable upheaval in the technology landscape. 

Traditional (‘classical') computers work with data encoded in bits, which are often represented by the presence (or absence) of a little electrical current. 


According to computational complexity theory, this choice leads to problems that will always be too expensive for traditional computers to solve. Simply put, the classical cost of modelling complicated physical or chemical systems doubles with each extra particle added. 
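
A minimal sketch of that doubling, assuming the standard state-vector picture in which a system of n two-level particles (qubits) is described by 2^n complex amplitudes:

    # Memory needed to store the full quantum state of n two-level particles (qubits),
    # assuming one 16-byte complex number per amplitude.
    def human(nbytes: float) -> str:
        for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
            if nbytes < 1024:
                return f"{nbytes:,.1f} {unit}"
            nbytes /= 1024
        return f"{nbytes:,.1f} EiB"

    for n in (10, 30, 50):
        amplitudes = 2 ** n                      # doubles with every extra particle
        print(f"{n:>2} particles: {amplitudes:>20,} amplitudes, {human(16 * amplitudes)}")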

In the early 1980s, American Nobel Laureate Richard Feynman proposed quantum computers as a solution to avoid this exponential expense. 


Information is encoded in quantum mechanical components called qubits, and quantum computers manage this information. 

In Google's Sycamore processor, for example, qubits are encoded in superconducting electrical currents that can be manipulated by precisely engineered electrical componentry. 

The ‘factoring problem,' in which a computer is tasked with finding the prime factors of a large number, remained an academic curiosity until quantum computers were shown to be capable of solving it efficiently. 


The RSA public-key cryptosystem, a cornerstone of internet security, is based on the hardness of this very problem. 

With that finding, a flurry of research activity erupted throughout the world to see if quantum computers could be developed and if so, how powerful they could be.




Post Quantum Computing Encryption - Future-Proofing Encryption



Encryption in the post-quantum era. 


Many popular media depictions of quantum computing claim that the creation of dependable large-scale quantum computers will bring cryptography to an end and that quantum computers are just around the corner. 

The latter point of view may turn out to be overly optimistic, or pessimistic if you happen to rely on security that quantum computing could break. 

While quantum computers have made significant progress in recent years, there's no certainty that they'll ever advance beyond laboratory proof-of-concept devices to become a practical everyday technology. (For a more thorough explanation, see a recent ASPI study.) 


Nonetheless, if quantum computing becomes a viable technology, several of the most extensively used encryption systems would be vulnerable to quantum-computer attacks, because quantum algorithms can drastically shorten the time it takes to crack them. 


For example, the RSA encryption scheme for the secure exchange of encryption keys, which underlies most web-based commerce, is based on the practical difficulty of finding prime factors of very big integers using classical (non-quantum) computers.

However, there is an extremely efficient quantum technique for prime factorization (known as ‘Shor's algorithm') that would make RSA encryption vulnerable to attack, jeopardizing the security of the vast quantity of economic activity that relies on the ability to safeguard data in motion. 

Other commonly used encryption protocols, such as the Digital Signature Algorithm (DSA) and Elliptic Curve DSA, rely on mathematical procedures that are difficult to reverse conventionally but may be vulnerable to quantum computing assaults. 


Moving to secure quantum communication channels is one technique to secure communications. 


However, while point-to-point quantum channels are conceivable (and immune to quantum computer assaults), they have large administration overheads, and constructing a quantum ‘web' configuration is challenging. 

A traditional approach is likely to be favored for some time to come for applications such as networking military force units, creating secure communications between intelligence agencies, and putting up a secure wide-area network. 


Non-quantum (classical) techniques to data security, fortunately, are expected to remain safe even in the face of quantum computer threats. 


The 256-bit Advanced Encryption Standard (AES-256), which is routinely employed to safeguard sensitive information at rest, has been found to be resistant to quantum attacks. 

Protecting data at rest addresses only half of the problem; a secure mechanism for transferring encryption keys between the start and end locations for data in motion is still required. 
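
For context, encrypting data at rest with AES-256 looks like this in practice; a minimal sketch using the widely used Python cryptography package and its AESGCM interface. Protecting and exchanging the key itself, the harder problem described above, is deliberately out of scope here.

    from os import urandom
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # the AES-256 key; sharing it securely is the real challenge
    aesgcm = AESGCM(key)

    nonce = urandom(12)                          # 96-bit nonce; must never be reused with the same key
    ciphertext = aesgcm.encrypt(nonce, b"sensitive data at rest", None)
    assert aesgcm.decrypt(nonce, ciphertext, None) == b"sensitive data at rest"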


As a result, a great deal of work is being done to construct so-called "post-quantum" encryption systems that rely on mathematical processes for which no efficient quantum algorithms are known. 


IBM has already detailed a quantum-resistant technology for safely transporting data across networks.  If the necessity arises, such a system might possibly replace RSA and other quantum-vulnerable encryption systems.



If everything else fails, there are always encryption technologies for the twenty-first century. 


One technique to improve communication security is to be able to ‘narrowcast' in such a way that eavesdropping is physically difficult, if not impossible. 

However, this is not always practicable, and there will always be messages that must pass over channels that are sensitive to eavesdropping. 


Even so-called "secure" channels can be breached at any time. 


The US Navy's tapping of a subsea cable running to a Soviet naval facility on the Kamchatka Peninsula in the 1970s is a good example. The cable was deemed safe because it ran wholly within Soviet territorial waters and was protected by underwater listening posts. 

As a result, it transmitted unencrypted messages. The gathered signals, though not of high intelligence value in and of themselves, gave cleartext ‘cribs' of Soviet naval communications that could be matched with encrypted data obtained elsewhere, substantially simplifying the cryptanalytic work. 

Even some of the low-probability-of-intercept/low-probability-of-detection (LPI/LPD) technology systems discussed in earlier sections may be vulnerable to new techniques. 

For example, the Pentagon has funded research on devices that gather single photons reflected off air particles to identify laser signals from outside the beam, with the goal of extracting meaningful information about the beam direction, data speeds, and modulation type. The ultimate objective is to be able to intercept laser signals in the future.  


A prudent communications security approach is to expect that an opponent will find a method to access communications, notwithstanding best attempts to make it as difficult as possible. 


Highly sensitive information must be safeguarded from interception, and certain data must be kept safe for years, if not decades. Cryptographic procedures that render an intercepted transmission unintelligible are required. 

As we saw in the section on the PRC's capabilities, a significant amount of processing power is already available to target Australian and allied military communications, and the situation is only going to get worse. 

On the horizon are technical dangers, the most well-known of which is the potential for effective quantum computing. Encryption needs to be ‘future-proofed.'


Space-based links as secure intermediaries. 


If the connection can be made un-interceptable, space-based communications might provide a secure communication route for terrestrial organizations. Information and control signals between spacecraft and the Earth have been sent by radio waves to and from ground stations until now. 

Interception is possible when collection systems are close enough to the uplink transmitter to collect energy from the unavoidable side lobes of the main beam, or when a collection system can be positioned inside the same downlink footprint as the receiver. 

The use of laser signals of various wavelengths to replace such RF lines has the potential to boost data speeds while also securing the communications against eavesdropping. 


Using laser communication links between spacecraft has a number of advantages as well. 

Transmission losses over long distances restrict the efficiency with which spacecraft with low power budgets can exchange vast amounts of data, and RF connections inevitably restrict bandwidth. 


The demands on space, weight, and power aboard spacecraft would be reduced if such links were replaced by laser communications. 

The benefits might include being able to carry larger sensor and processing payloads, spending more time on mission (owing to reduced downtime to recharge batteries), or a combination of the two. 

In the United States, the Trump administration's Space Force and anticipated NASA operations (including a presence on the moon and deep space missions) have sparked a slew of new space-based communications research initiatives. 


NASA has a ten-year project road map (dubbed the "decade of light") aiming at creating infrared and optical frequency laser communication systems, combining them with RF systems, and connecting many facilities and spacecraft into a reliable, damage-resistant network. 

As part of that effort, it is developing various technology demonstrations. 

Its Laser Communications Relay Demonstration, which is set to go live in June, will use lasers to encode and send data at speeds 10 to 100 times faster than radio systems.  

NASA uses the example of transmitting a map of Mars' surface back to Earth, which might take nine years with present radio technology but just nine weeks using laser communications. 
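
A quick arithmetic check of that comparison (a small sketch; the nine-year and nine-week figures are NASA's, and the week counts are approximate):

    radio_weeks = 9 * 52          # roughly nine years expressed in weeks
    laser_weeks = 9
    print(f"Speed-up: about {radio_weeks / laser_weeks:.0f}x")
    # ~52x, consistent with the quoted 10-100x range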

The practicality of laser communications has been demonstrated in laboratory prototype systems, and NASA plans to launch space-based versions later this year. The Pentagon's Space Development Agency (SDA) and the Defense Advanced Research Projects Agency (DARPA) are both working on comparable technologies, but with military and intelligence purposes in mind. 


The SDA envisions hundreds of satellites linked by infrared and optical laser communication connections. 

Under the plan, sensor data will be relayed from spacecraft to spacecraft until it reaches a satellite in contact with a ground station. Information from an orbiting sensor grid could therefore be sent to Earth in sub-second time frames, rather than the tens of minutes it can take for a low-Earth-orbiting satellite to pass within line of sight of a ground station. 

Furthermore, because of the narrow beams created by lasers, an eavesdropper has very little chance of intercepting the message. And because of the increased communication efficiency, ‘traffic jams' in the far more heavily used radio spectrum become significantly less likely. 

This year, the SDA plans to conduct a test with a small number of "cubesats." Moving to even higher frequencies, X-ray beams could in principle carry very high data-rate signals. In terrestrial applications, ionization of atmospheric gases would quickly attenuate such signals, but this is not an issue in space, and NASA is presently working on gigabit-per-second X-ray communication links between spacecraft.  

Although NASA is primarily interested in applications for deep-space missions (current methods can take many hours to transmit a single high-resolution photograph of a distant object such as an asteroid after a flyby), the technology has the potential to link future constellations of intelligence-gathering and communications satellites with extremely high data-rate channels. NASA has placed a technology demonstration on board the International Space Station.



Communications with a low chance of being detected. 


One technique to keep communications safe from an enemy is to never send them over routes that can be detected or intercepted. For mobile force units, this isn't always practicable, but when it is, communications security may be quite effective. 

The German army curtailed its radio transmissions in the run-up to its Ardennes operation in December 1944, depending instead on couriers and landlines operating within the region it held (which was contiguous with Germany, so that command and control traffic could mostly be kept off the airwaves).

The build-up of considerable German forces was overlooked by Allied intelligence, which had been lulled into complacency by routinely having been forewarned of German moves via intercepted radio communications. 

Even today, when fibre-optic connections can transmit data at far greater rates than copper connections, the option to go "off air" when circumstances allow is still valuable. Of course, mobile troops will not always have the luxury of transferring all traffic onto cables, especially in high-speed scenarios, but there are still techniques to substantially minimize the footprint of communication signals and, in some cases, render them effectively undetectable. 


Frequency-hopping and spread-spectrum radios were two previous methods for making signals less visible to an eavesdropper. 


Although these approaches lower the RF footprint of transmissions, they are now vulnerable to detection, interception, and exploitation using wideband receivers and computer spectral analysis tools. Emerging technologies provide a variety of innovative approaches to achieve the same aim while improving security. 

The first is to use highly directional ‘line of sight' signals that can be aimed directly at the intended receiver, limiting an adversary's ability even to detect the transmission. This might be accomplished, for example, by using tightly focused laser signals of various wavelengths that can be precisely directed at the intended recipient's antenna when geography allows. 


A space-based relay, in which two or more force components are linked by laser communication channels with a constellation of satellites, which are connected by secure links (see the following section for examples of ongoing work in that field), offers a difficult-to-intercept communications path. 


As a consequence, data could be sent with far less chance of being intercepted than with RF signals. For a satellite system with a worldwide footprint for its uplinks and downlinks, the distances between the communicating parties are virtually unlimited. A second strategy, better suited to force elements in close proximity, is to move radio signals to wavelengths that do not travel far because of atmospheric absorption but still give effective communications capability at short ranges. 


The US Army, for example, is doing research on deep-ultraviolet communications (UVC). UVC has the following benefits over radio frequencies such as UHF and VHF: 


• the higher frequency allows for faster data transfer

• very low-powered signals can still be received over short distances

• signal strength rapidly drops off over a critical distance 








