<![CDATA[ Latest from Live Science in Computing ]]> https://www.livescience.com 2025-04-14T16:00:10Z en <![CDATA[ What is quantum superposition and what does it mean for quantum computing? ]]>

An abstract illustration depicting quantum entanglement. (Image credit: VICTOR de SCHWANBERG/SCIENCE PHOTO LIBRARY via Getty Images)

Quantum physics is the branch of science that deals with the tiniest particles in the universe, such as atoms, electrons, photons (light particles), and other subatomic particles like quarks.

In the everyday world, at the scale that we can see, things tend to follow the laws of classical physics. However, when you zoom all the way in to the smallest particles, classical physics stops working quite as well, and the rules of quantum mechanics come into play.

Some of the key concepts of quantum physics are that particles like electrons can behave as waves, and vice versa (known as wave-particle duality); two particles can be linked in such a way that if you measure one, you instantly know something about the other (quantum entanglement); and a quantum particle can be in multiple states at once until it's observed (quantum superposition).

What is quantum superposition?

In everyday life, something can only be in one state at a time: a light switch is either on or off, a cat is either dead or alive. In the quantum world, things don't work quite the same way. Quantum superposition describes how a quantum particle, like an electron, a photon, or even an atom, can exist in multiple different states at the same time — until it's measured. Before it's observed, it's not halfway between states, but is instead a "superposition" of the two at once.

In quantum physics, the state of a particle is described by a wave function, which encodes the probabilities of where the particle might be found or what values its properties might take. This wave function can exist as a blend of multiple states at once.
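As a concrete illustration (our own minimal sketch, not from the article), a superposition can be written as a list of complex amplitudes, one per possible state; squaring their magnitudes gives the measurement probabilities, which always sum to one.

```python
import numpy as np

# A qubit-like two-state superposition: one complex amplitude per state
# (the specific amplitudes here are illustrative)
state = np.array([1, 1j]) / np.sqrt(2)

probabilities = np.abs(state) ** 2   # Born rule: probability = |amplitude|^2
print(probabilities)                 # [0.5 0.5] -- each outcome equally likely
print(probabilities.sum())           # 1.0 -- the probabilities sum to one
```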

What is Schrödinger’s Cat?

Schrödinger’s Cat is a famous thought experiment that illustrates how superposition works. Imagine a cat in a box with a mechanism that has a 50/50 chance of killing it, triggered by whether or not a radioactive atom decays, spontaneously transforming into a different type of atom and releasing radiation such as electrons.

Until someone opens the box and observes it, the cat is considered to be in a superposition of both alive and dead. When you measure or observe the system — or in the case of Schrödinger’s Cat look inside the box — the superposition settles into one definite state, and the cat's fate is discovered.

Related: Physicists create hottest Schrödinger's cat ever in quantum technology breakthrough

Quantum superposition has been experimentally observed by scientists on multiple occasions. One famous example is the double-slit experiment, in which photons are fired at a barrier with two slits, behind which sits a screen that records where the particles land. If you send particles through one slit, you get a single band on the screen; if you open both, you get a wave-like interference pattern with multiple bands, a demonstration of wave-particle duality. Sending one particle at a time, you would expect each one to go through one slit or the other. However, the interference pattern still builds up, as if each single particle were interfering with itself. This means that each particle somehow goes through both slits at once, and is therefore in a superposition of both possibilities.

If you try to measure which slit the particle goes through, the superposition collapses: the particle does appear to have passed through a single slit, and the interference pattern disappears, leaving only two bands on the screen.
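A toy calculation (our own, in illustrative units rather than a real experimental setup) shows where the bands come from: with both slits open, the two waves' amplitudes are added before squaring, producing alternating bright and dark fringes; adding the intensities of two definite paths gives no pattern at all.

```python
import numpy as np

x = np.linspace(-2, 2, 9)   # positions across the screen (toy units)
phase = np.pi * x           # phase difference between the two slits' waves

# Waves: add AMPLITUDES first, then square -> interference fringes
both_slits = np.abs(np.exp(1j * phase) + np.exp(-1j * phase)) ** 2

# Definite paths: add INTENSITIES -> flat, featureless distribution
either_slit = np.abs(np.exp(1j * phase)) ** 2 + np.abs(np.exp(-1j * phase)) ** 2

print(np.round(both_slits, 1))   # alternates 4.0 and 0.0: bright and dark bands
print(np.round(either_slit, 1))  # constant 2.0 everywhere: no bands
```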

Additionally, ions and even large molecules have been experimentally placed in superposition, and chlorophyll in the leaves of plants has been found to exploit quantum superposition to harvest sunlight more efficiently.

Why is superposition so important in quantum computing?

Quantum superposition is also used as a tool in quantum computing and is the main reason quantum computers can be so powerful.

A classical binary bit can only be in one state at a time: 0 or 1. These bits are encoded on transistors, usually made from silicon, germanium or other semiconductors. Three bits together can represent eight different states: 000, 001, 010, 011, 100, 101, 110 and 111, but they can occupy only one of them at any moment. To process all the possibilities, a classical computer has to check them one at a time.

In quantum computers, particles such as electrons or photons act as qubits (quantum bits), which can be in a superposition of both 0 and 1. Three qubits can be in a superposition of all eight of the states listed above at once, meaning a quantum computer can work through a vastly larger number of possibilities simultaneously, as the sketch below illustrates.
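The contrast is easy to see in code. This minimal sketch (our own, idealized and noise-free) enumerates the eight states three classical bits can occupy one at a time, then builds the eight-amplitude vector that describes a three-qubit register all at once.

```python
import itertools
import numpy as np

# Classical: three bits take on ONE of eight states at a time
for bits in itertools.product([0, 1], repeat=3):
    print(bits)

# Quantum: a three-qubit register is ONE vector of eight complex amplitudes
# (an equal superposition here)
state = np.full(8, 1 / np.sqrt(8))
print(np.abs(state) ** 2)   # measurement gives each of the 8 states with probability 1/8
```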

This far greater processing power could mean that quantum computers are one day used to perform complex simulations in pharmaceuticals, climate modeling and manufacturing. In theory, a sufficiently powerful quantum computer could perform calculations in seconds that would take the most powerful supercomputers millions of years to complete.

When is World Quantum Day?

World Quantum Day, an international celebration that promotes public understanding of quantum science, is held annually on April 14.

The date, 4/14, was chosen because 4.14 represents the first three digits of Planck’s constant (4.135667696 x 10^-15 electron volts per hertz, rounded to 4.14 x 10^-15) — an important number in quantum physics.

]]>
https://www.livescience.com/technology/computing/what-is-quantum-superposition-and-what-does-it-mean-for-quantum-computing UF4PwXzbx7Htu2KmitZQa9 Mon, 14 Apr 2025 16:00:10 +0000
<![CDATA[ Quantum computing breakthrough could make 'noise' — forces that disrupt calculations — a thing of the past ]]> Scientists have discovered a groundbreaking method to shield quantum information from "noise" — and it could finally let us build practical quantum computers.

Quantum computers rely on quantum entanglement, the connection between the quantum properties of two particles that are shared instantaneously across time and space. This enables quantum computers to perform faster calculations than their traditional counterparts because they can process information in parallel rather than in sequence.

But maintaining this "coherence" is difficult due to "noise" from the outside world, as interactions with loose particles, rays of light and even minute changes in temperature can break the entanglement and disperse the information within. That's why the error rate in qubits is much higher than in conventional bits in classical computing.

"Basically even though companies claim [they have] 1,000 qubits, very few of them are useful. Noise is the reason," study co-author Andrew Forbes, a professor of physics at the University of Witwatersrand in Johannesburg, South Africa told Live Science. "Everyone agrees that there is no point in pushing for more qubits unless we can make them less noisy."

Now, by encoding the information in the topology (or the properties that stem from the shape) of two entangled photons, a team of physicists has found a way to preserve quantum information, even amid a storm of noise. The researchers published their findings on March 26 in the journal Nature Communications.

Related: MIT invents new way for QPUs to communicate — paving the way for a scalable 'quantum supercomputer'

In much the same way that traditional computer bits are the basic units of digital information, qubits encode quantum information. Like bits, qubits can exist as a 1 or a 0, representing the two possible positions in a two-state system.

Thanks to the bizarre rules of the quantum world, qubits can also exist in theoretically infinite superpositions of the two classical states. And when they're entangled inside quantum computers, their ability to crunch numbers grows exponentially.

But this quantum daisy chain is fragile: Even when housed inside extremely cold and highly insulated cryostats, current quantum computers are still infiltrated by tiny disturbances that rapidly disrupt the delicate processes within.

Quantum noise cancellation

The typical strategy for preventing quantum decoherence is to preserve entanglement, but such efforts have so far enjoyed only limited success. To look for a way around this, the researchers behind the new study sought to preserve information even in systems that had already partially decohered.

"We decided to let the entanglement decay — it is always fragile so let it be so — and instead preserve information even with very little entanglement," Forbes said.

For their solution, Forbes and his colleagues turned to a type of qubit known as a "topological qubit" that encodes information in the shape made by two entangled particles. They settled on a quasiparticle known as an optical skyrmion, a wave-like field formed between two entangled photons.

After exposing the skyrmions to varying levels of noise, the researchers found that the patterns and information coded within remained resilient far beyond the point where non-topological systems would decohere.

"It turns out that so long as some entanglement remains, no matter how little, the topology stays intact," Forbes said. "The topology only disappears when the entanglement vanishes."

The scientists believe their approach could play a key role in making quantum computers and networks that can overcome noise in any environment. Their next step will be to create a "topological toolkit" that can encode practical information into a skyrmion and get it out again.

"Once we have this, we can start to think about using topology in practical situations, like communication networks and in computing," Forbes said.

]]>
https://www.livescience.com/technology/computing/breakthrough-that-shields-quantum-information-from-noise-brings-a-quantum-internet-one-step-closer rh8VFctc4mUSvMzQ8Ntwr6 Wed, 09 Apr 2025 12:13:20 +0000
<![CDATA[ MIT invents new way for QPUs to communicate — paving the way for a scalable 'quantum supercomputer' ]]> Researchers have created a device that allows quantum processors to communicate with each other directly — an important step in developing practical quantum computers. It could mean both faster and less error-prone communication between processors.

Existing quantum architecture offers only limited communication between separate quantum processing units (QPUs). Such communication is "point-to-point," meaning that information has to be transferred in a chain across several nodes before reaching its destination. This increases the possibility of exposing the quantum information to noise and makes it more likely for errors to occur.

However, the new device developed by MIT scientists allows for "all-to-all" communication, so that all processors in a single network can communicate directly with any other processor. The researchers outlined their "remote entanglement" approach in a new study published March 21 in the journal Nature Physics.

Remote entanglement is a condition in which two distant particles share the same quantum state, so that changes to one are immediately reflected in the other. The distance between the two can be vast, with no currently known limit.

In testing, the researchers connected two quantum processors by way of modules, each comprising four qubits. Some of the qubits in each module were tasked with sending photons, light particles that can be used to transmit quantum data, while others were assigned to storing data.

The modules were linked together with a superconducting wire called a waveguide, with the modules serving as an interface between the larger quantum processors and the waveguide. The scientists said that any number of processors could be connected in this way, creating a highly scalable network.

The researchers then used microwave pulses to spark an individual qubit into emitting photons in either direction across the waveguide.

Related: China achieves quantum supremacy claim with new chip 1 quadrillion times faster than the most powerful supercomputers

"Pitching and catching photons enables us to create a ‘quantum interconnect’ between nonlocal quantum processors, and with quantum interconnects comes remote entanglement,” said senior author of the study William D. Oliver, Associate Director of the Research Laboratory of Electronics at MIT, in a statement.

Photonic distortion

Entanglement is a state where two particles become connected and share information, even at vast distances. A change in one entangled particle will immediately affect its partner. It’s a critical phenomenon for quantum computing because it allows qubits to be correlated and act as a single system. This, in turn, lets us create algorithms that are impossible with classical computers.

However, just moving photons back and forth between modules doesn’t automatically create entanglement. To achieve that, the team had to specially prepare both the qubits and the photon, so that after being transferred, the modules shared a single photon.

To force the two modules to share the same photon, they had to interrupt photon emission pulses at the halfway point. This essentially meant that half of the photon was absorbed on the receiving end while half was retained by the emitting module.

The problem with this method is that the photon becomes distorted as it travels across the waveguide, which can impair absorption and disrupt entanglement. To compensate for this flaw in the architecture, the team deliberately distorted the photons before transmission so that they would arrive in a shape suited to maximum absorption. This raised absorption levels to 60%, enough to ensure entanglement.

The work is broadly applicable to practical quantum computing applications, according to lead author of the study Aziza Almanakly, an electrical engineering and computer science graduate student.

"In principle, our remote entanglement generation protocol can also be expanded to other kinds of quantum computers and bigger quantum internet systems," Almanakly said.

]]>
https://www.livescience.com/technology/computing/mit-invents-new-way-for-qpus-to-communicate-paving-the-way-for-a-scalable-quantum-supercomputer ugv8XYtHF5tQhr3yavBBTC Sun, 06 Apr 2025 12:00:20 +0000
<![CDATA[ Mini desktop supercomputer coming this year — powerful enough to run advanced AI models and small enough to fit in your bag ]]> Nvidia has unveiled a line of artificial intelligence (AI) supercomputers that can deliver unprecedented processing power in a portable, desktop-friendly chassis.

Previously dubbed Project Digits, the powerful machines first revealed at CES 2025 in January have been rebranded as the DGX Spark and DGX Station machines. These computers are powered by Nvidia’s Blackwell Ultra platform and promise up to a petaFLOP in processing power — upwards of 1,000 times faster than the best laptops or high-end desktop PCs.

Blackwell Ultra is designed for massive-scale AI training and testing, and the DGX machines promise to put that power into the hands of data scientists, AI researchers and students at a relatively affordable price point. It’s the equivalent of putting the power of a data center into a computer small enough to fit on your desk.

The DGX Spark is a little box smaller than a laptop that you could easily tuck away on a corner of your desk or fit into your bag. It stands just under 2 inches (5 centimeters) in height and slightly under 6 inches (15 cm) in width, and at its core is the GB10 Grace Blackwell Superchip, capable of delivering 1,000 AI TOPS (trillions of operations per second).

Related: The 9 most powerful supercomputers in the world right now

It also comes with 128 gigabytes of unified system memory and Nvidia’s full stack of AI software, including tools, libraries and pretrained models. These include the CUDA Deep Neural Network (cuDNN) library, which accelerates neural network layers during training and inference, and a pretrained SegFormer model. The Nvidia version of the DGX Spark is available to reserve online starting at $3,999, although the company has said other models will soon be available from manufacturers like ASUS, Dell and Lenovo.

Supercomputing power in a desktop tower chassis

The DGX Station is the Spark’s larger, more powerful sibling and is closer in size to a professional workstation.

Built around the GB300 Grace Blackwell Ultra Desktop Superchip, it contains a staggering 784 GB of "large coherent" memory — memory that can be accessed by more than one processor at a time.

It also features Nvidia’s ConnectX-8 SuperNIC, which enables network connectivity at a blistering rate of up to 800 gigabits a second — fast enough to download approximately five 4K movies in a second.
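That movie figure holds up as back-of-the-envelope arithmetic, assuming a 4K movie of roughly 20 gigabytes (our assumption; real file sizes vary widely):

```python
link_gbps = 800    # ConnectX-8 SuperNIC: 800 gigabits per second
movie_gb = 20      # assumed size of one 4K movie, in gigaBYTES
movies = 5

seconds = movies * movie_gb * 8 / link_gbps   # x8 converts bytes to bits
print(seconds)     # 1.0 -- five such movies in about one second
```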

It also uses the NVLink-C2C Interconnect to connect internal components at 900 gigabytes per second. The DGX Station is a powerhouse designed to execute large-scale AI training and inferencing workloads from the comfort of your desktop, without having to access additional resources through the cloud.

The DGX Station isn’t currently available to reserve, but Nvidia has indicated it will be available later in 2025 from partners like Asus, Dell, HP, Lambda and Supermicro.

Jensen Huang, Nvidia’s founder and CEO, said the new DGX machines represent a natural next step in AI development.

"AI has transformed every layer of the computing stack. It stands to reason a new class of computers would emerge, designed for AI-native developers and to run AI-native applications,” he said in a statement. "With these new DGX personal AI computers, AI can span from cloud services to desktop and edge applications."

]]>
https://www.livescience.com/technology/computing/mini-desktop-supercomputer-coming-this-year-powerful-enough-to-run-advanced-ai-models-and-small-enough-to-fit-in-your-bag ovA3WuqMtSCHyjnMuqDVGR Sat, 05 Apr 2025 13:00:00 +0000
<![CDATA[ Quantum computers will be a dream come true for hackers, risking everything from military secrets to bank information. Can we stop them? ]]> Quantum computers are coming. And when they arrive, they are going to upend the way we protect sensitive data.

Unlike classical computers, quantum computers harness quantum mechanical effects — like superposition and entanglement — to process and store data in a form beyond the 0s and 1s that are digital bits. These "quantum bits" — or qubits — could open up massive computing power.

That means quantum computers may solve complex problems that have stymied scientists for decades, such as modeling the behavior of subatomic particles or cracking the "traveling salesman" problem, which seeks the shortest route through a set of cities that returns to its starting point. But this massive power also may give hackers the upper hand.


"Like many powerful technologies, you can use [quantum computing] for great good," Rebecca Krauthamer, a technological ethicist and CEO of cybersecurity firm QuSecure, told Live Science. "And you can also use it for malicious purposes."

When usable quantum computers first come online, most people — and even most large organizations — will still rely on classical computers. Cryptographers therefore need to come up with ways to protect data from powerful quantum computers, using programs that can run on a regular laptop.

That's where the field of post-quantum cryptography comes in. Several groups of scientists are racing to develop cryptographic algorithms that can evade hacking by quantum computers before they are rolled out. Some of these cryptographic algorithms rely on newly developed equations, while others are turning to centuries-old ones. But all have one thing in common: They can't be easily cracked by algorithms that run on a quantum computer.

"It's like a foundation for a three-story building, and then we built a 100-story skyscraper on it."

Michele Mosca, co-founder and CEO of cybersecurity company evolutionQ

The foundations of cryptography

Cryptography dates back thousands of years; the earliest known example is a cipher carved into ancient Egyptian stone in 1900 B.C. But the cryptography used by most software systems today relies on public-key algorithms. In these systems, the computer uses algorithms, often built on the product of two large prime numbers, to generate both a public key and a private key. The public key is used to scramble the data, while the private key, held only by the intended recipient, can be used to unscramble it.

To crack such cryptography, hackers and other malefactors often must factor the products of very large prime numbers or try to find the private key by brute force — essentially throwing out guesses and seeing what sticks. This is a hard problem for classical computers because they have to test each guess one after another, which limits how quickly the factors can be identified.
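A toy example makes that asymmetry concrete. The sketch below (our own, with deliberately tiny primes; real RSA keys use primes hundreds of digits long) builds a key pair and shows that whoever can factor the public modulus back into p and q can rebuild the private key:

```python
p, q = 61, 53                 # the two secret primes
n = p * q                     # 3233, the public modulus
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)     # encrypt with the PUBLIC key
recovered = pow(ciphertext, d, n)   # decrypt with the PRIVATE key
print(ciphertext, recovered)        # 2557 42
```

Factoring n = 3233 by trial division is instant; factoring a 2,048-bit modulus is far beyond any classical computer, and that gap is the entire security margin.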

A close-up of a quantum computer being built by the German start-up IQM. (Image credit: dpa picture alliance via Alamy)

A 100-story skyscraper on a three-story building

Nowadays, classical computers often stitch together multiple encryption algorithms, implemented at different locations, such as a hard disk or the internet.

"You can think of algorithms like building bricks," Britta Hale, a computer scientist at the Naval Postgraduate School, told Live Science (Hale was speaking strictly in her capacity as an expert and not on behalf of the school or any organization.) When the bricks are stacked, each one makes up a small piece of the fortress that keeps out hackers.

But most of this cryptographic infrastructure was built on a foundation developed in the 1990s and early 2000s, when the internet was much less central to our lives and quantum computers were mainly thought experiments. "It's like a foundation for a three-story building, and then we built a 100-story skyscraper on it," Michele Mosca, co-founder and CEO of cybersecurity company evolutionQ, told Live Science. "And we're kind of praying it's OK."

It might take a classical computer thousands or even billions of years to factor the huge numbers behind really strong encryption, but a powerful quantum computer could crack the same problem in a few hours. That's because a quantum computer can run many calculations simultaneously by exploiting quantum superposition, in which qubits can exist in multiple states at once. In 1994, American mathematician Peter Shor showed that quantum computers can efficiently run algorithms that quickly solve prime-number factoring problems. As a result, quantum computers could, in theory, tear down the cryptographic fortresses we currently use to protect our data.

Post-quantum cryptography aims to replace obsolete building blocks with less-hackable bricks, piece by piece. And the first step is to find the right math problems to use. In some cases, that means returning to equations that have been around for centuries.

Currently, the National Institute of Standards and Technology (NIST) is looking at four problems as potential foundations for post-quantum cryptography. Three belong to a mathematical family known as structured lattices. These problems concern vectors (mathematical objects that have both direction and magnitude) connecting a web of interconnected nodes, like the connection points in a spiderweb, Mosca said. These lattices can theoretically have an infinite number of nodes and exist in multiple dimensions.

Experts believe lattice problems will be hard for a quantum computer to crack because, unlike some other cryptographic algorithms, lattice problems don't rely on factoring massive numbers.

Instead, they use the vectors between nodes to create a key and encrypt the data. Solving these problems may involve, for example, calculating the shortest vector in the lattice, or trying to determine which vectors are closest to one another. If you have the key — often a "good" starting vector — these problems may be relatively easy. But without that key, they are devilishly hard. That's because no one has devised an algorithm, like Shor's algorithm, that can efficiently solve these problems using quantum computing architecture.
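A two-dimensional toy version (our own illustration; real schemes work in hundreds of dimensions, where this brute-force search becomes hopeless) gives the flavor of the closest-vector problem:

```python
import itertools
import numpy as np

basis = np.array([[201, 37],       # a "bad" public basis: long, skewed vectors
                  [1648, 297]])
target = np.array([5234, 941])     # the point we want to approximate

# Brute force: try every small integer combination of the basis vectors
best = min(itertools.product(range(-30, 31), repeat=2),
           key=lambda c: np.linalg.norm(np.dot(c, basis) - target))
print(best, np.dot(best, basis))   # best coefficients and the lattice point they reach
```

In two dimensions this search takes milliseconds; in the hundreds of dimensions used by real lattice schemes, the number of combinations explodes, and no efficient shortcut is known without the secret "good" basis.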

An infographic showing how lattice-based cryptography works. (Image credit: IBM Research via Science Photo Library)

The fourth problem that NIST is considering belongs to a group called hash functions. Hash functions work by taking an input, such as the virtual key for unlocking a specific point in a data table, scrambling it and compressing it into a shorter, fixed-length code. This type of algorithm is already a cornerstone of modern cybersecurity, so in theory, upgrading classical computers to a quantum-proof version should be more straightforward than with other post-quantum cryptographic schemes, Mosca said. And as with structured lattices, hash functions can't easily be broken by brute force alone; you would need some clue about what's going on inside the "black box" key generator to figure them out within the age of the universe.
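Python's standard library ships with such a hash function, so the one-way behavior is easy to see for yourself (a generic SHA-256 illustration, not the specific construction NIST is evaluating):

```python
import hashlib

print(hashlib.sha256(b"quantum").hexdigest())
print(hashlib.sha256(b"Quantum").hexdigest())   # one changed letter: a completely different digest
```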

But these four problems don't cover all of the potentially quantum-safe algorithms in existence. For example, the European Commission is looking at an error-correcting code known as the McEliece cryptosystem. Developed more than 40 years ago by American engineer Robert McEliece, this system uses random number generation to create a public and private key, as well as an encryption algorithm. The recipient of the private key uses a fixed cipher to decrypt the data.

McEliece encryption is widely considered both faster and more secure than the most commonly used public-key cryptosystem, Rivest-Shamir-Adleman (RSA). As with a hash function, would-be hackers need some insight into its black-box encryption to break it. On the plus side, experts consider the system very safe; on the downside, even the keys to unscramble the data must be processed using extremely large, cumbersome matrices, which require a lot of energy to use.

A similar error-correcting code, known as Hamming Quasi-Cyclic (HQC), was recently selected by NIST as a backup to its primary candidates. Its primary advantage over the classic McEliece system is that it utilizes smaller key and ciphertext sizes.

Another type of algorithm that sometimes comes up in conversations about post-quantum cryptography is the elliptic curve, Bharat Rawal, a computer and data scientist at Capitol Technology University in Maryland, told Live Science. These problems go back at least to ancient Greece. Elliptic curve cryptography exploits basic algebra — calculating the points on a curved line — to encrypt keys. Some experts believe a new elliptic curve algorithm could evade hacking by a quantum computer. However, others argue that a hacker could hypothetically use Shor's algorithm on a quantum computer to break most known elliptic curve algorithms, making them a less-secure option.

A close-up of a qubit chip at the Fujitsu laboratory in Tokyo. (Image credit: Aflo Co. Ltd. via Alamy)

No silver bullet

In the race to find quantum-safe cryptographic equations, there won't be a silver bullet or a one-size-fits-all solution. For example, there's always a trade-off in processing power; it wouldn't make much sense to use complex, power-hungry algorithms to secure low-priority data when a simpler system might be perfectly adequate.

"It's not like one algorithm [combination] will be the way to go; it depends on what they're protecting," Hale said.

In fact, it's valuable for organizations that use classical computers to have more than one algorithm that can protect their data from quantum threats. That way, "if one is proven to be vulnerable, you can easily switch to one that was not proven vulnerable," Krauthamer said. Krauthamer's team is currently working with the U.S. Army to improve the organization's ability to seamlessly switch between quantum-safe algorithms — a feature known as cryptographic agility.

Even though useful (or "cryptographically relevant") quantum computers are still several years away, it is vital to start preparing for them now, experts said. "It can take many years to upgrade existing systems to be ready for post-quantum cryptography," Douglas Van Bossuyt, a systems engineer at the Naval Postgraduate School, told Live Science in an email. (Van Bossuyt was speaking strictly as a subject-matter expert and not on behalf of the Naval Postgraduate School, the Navy or the Department of Defense.) Some systems are tough to upgrade from a coding standpoint. And some, such as those aboard military craft, can be difficult — or even impossible — for scientists and engineers to access physically.

Other experts agree that post-quantum cryptography is a pressing issue. "There's also the chance that, again, because quantum computers are so powerful, we won't actually know when an organization gets access to such a powerful machine," Krauthamer said.

There's also the threat of "harvest-now, decrypt-later" attacks. Malicious actors can scoop up sensitive encrypted data and save it until they have access to a quantum computer that's capable of cracking the encryption. These types of attacks can have a wide range of targets, including bank accounts, personal health information and national security databases. The sooner we can protect such data from quantum computers, the better, Van Bossuyt said.

And as with any cybersecurity approach, post-quantum cryptography won't represent an end point. The arms race between hackers and security professionals will continue to evolve well into the future, in ways that we can only begin to predict. It may mean developing encryption algorithms that run on a quantum computer as opposed to a classical one or finding ways to thwart quantum artificial intelligence, Rawal said.

"The world needs to keep working on this because if these [post-quantum equations] are broken, we don't want to wait 20 years to come up with the replacement," Mosca said.

]]>
https://www.livescience.com/technology/computing/quantum-computers-will-be-a-dream-come-true-for-hackers-risking-everything-from-military-secrets-to-bank-information-can-we-stop-them 5zLESLWd9DCKAnnxGyZcUZ Fri, 04 Apr 2025 16:00:10 +0000
<![CDATA[ World's first light-powered neural processing units (NPUs) could massively reduce energy consumption in AI data centers ]]> A light-powered computer chip designed to drive artificial intelligence (AI) data centers and make high-performance computing (HPC) more sustainable has entered production.

In a statement published Feb. 24, representatives from analog photonic chip company Q.ANT said its photonic AI chip could deliver a 30-fold increase in energy efficiency and a 50-fold boost in computing speed compared with conventional, silicon-based computer chips.

Pilot production of the new chip is now underway at IMS Chips in Stuttgart, Germany, where Q.ANT has invested 14 million euros ($15.1 million) to repurpose an existing semiconductor factory to fabricate its new, light-powered chip.

Because the chip is being produced in a repurposed facility instead of on a specialist production line, the company believes it can bring the technology to market much more quickly. The chip can also integrate with existing HPC servers, potentially accelerating adoption, Q.ANT representatives said.

"By 2030, we aim to make our photonic processors a scalable, energy-efficient cornerstone of AI infrastructure," Michael Förtsch, chief executive of Q.ANT, said in the statement.

Photonic computing

Photonic chips could solve a massive challenge faced by existing processor technology, particularly as AI and other data- and resource-intensive computing applications grow.

Traditional silicon chips control electrical signals using tiny switches called transistors. Photonic chips, by contrast, process data using light particles (photons), which are massless and can travel much faster than electrons do in conventional computer chips.

Photons don't emit heat in the same way electrons carrying an electrical charge do. As such, using photonic chips in applications involving complex, energy-intensive computations like AI could overcome the limitations of classic silicon chip architecture and thus vastly accelerate the computers' processing speed and reduce their energy consumption.

Related: 'Crazy idea' memory device could slash AI energy consumption by up to 2,500 times

"This comes at a critical time for the computing industry, as the exponential growth of AI and data-intensive applications will soon overwhelm the current data center infrastructure," Jens Anders, a professor at the University of Stuttgart and director and chief executive of IMS Chips, said in the statement. Anders added that the two companies aimed to establish "a scalable model for energy-efficient computing."

Q.ANT's chip is built using thin-film lithium niobate (TFLN), a crystalline compound applied to a wafer that forms the basis of the company's photonic chip. TFLN is increasingly catching the attention of photonics researchers and quantum scientists for its potential in next-generation computing. When an electric field is applied to the material, it changes the speed and phase of the light passing through, enabling optical signals to be modulated with extreme precision.

The pilot production line has been set up specifically to manufacture chips that incorporate TFLN, with Q.ANT aiming to fabricate 1,000 wafers per year.

"As AI and data-intensive applications push conventional semiconductor technology to its limits, we need to rethink the way we approach computing at the core," Förtsch said. "With this pilot line, we are accelerating time to market and laying the foundation for photonic processors to become standard coprocessors in high-performance computing."

]]>
https://www.livescience.com/technology/computing/worlds-first-light-powered-neural-processing-units-npus-could-massively-reduce-energy-consumption-in-ai-data-centers SLYopp3XEbTypQZ89paSGD Thu, 03 Apr 2025 11:30:00 +0000
<![CDATA[ Scientists create ultra-efficient magnetic 'universal memory' that consumes much less energy than previous prototypes ]]> Scientists in Japan have developed a new kind of "universal" computing memory that is much faster and less energy-hungry than modules used in the best laptops and PCs today.

Magnetoresistive Random Access Memory (MRAM) is a type of universal memory device that can overcome some of the limitations of conventional RAM, which can slow down at peak demand due to its relatively low capacity. Universal memory is a storage format that combines the speed of existing RAM with the ability of storage drives to retain information without a power supply.

Universal memory like MRAM is a better proposition than the components used in computers and smart devices today as it offers higher speeds and much greater capacity, as well as better endurance.

This new technology operates at faster speeds and with greater capacity than conventional RAM, but overcomes the problem of high power requirements for data writing — which has previously been a challenge for MRAM.

MRAM devices consume little power in their standby state, but they need a large electric current to switch the direction of the magnetization vectors in their magnetic tunnel junctions; it is the direction of magnetization that represents the binary values in these devices. That requirement makes conventional MRAM infeasible for most computing systems, so achieving low-power data writing called for a more efficient method of switching these vectors.

Related: 'Quantum memory breakthrough' may lead to a quantum internet

In a paper published Dec. 25, 2024, in the journal Advanced Science, researchers reported developing a new component for controlling the electric field in MRAM devices. Their method requires far less energy to switch polarity, thereby lowering the power requirements and improving the speed at which processes are performed.

Next-generation computing memory

The prototype component they built is called a "multiferroic heterostructure" — a ferromagnetic material and a piezoelectric material with an ultrathin layer of vanadium between them — that can be magnetized by an electric field. This differs from other MRAM devices, which lack the vanadium layer.

Structural fluctuations in the ferromagnetic layer made it difficult for previous MRAM devices to maintain a stable direction of magnetization. To overcome this stability issue, the vanadium layer between the ferromagnetic and piezoelectric layers acts as a buffer between the two.

By passing an electric current through the materials, the scientists demonstrated that the magnetic state could switch direction. The materials could maintain their shape and form, which previous versions could not do. Furthermore, the magnetic state was maintained after the electric charge was no longer present, allowing a stable binary state to be maintained without power.

The study did not cover the degradation in the switching efficiency over time. This tends to be a common problem with a wide range of electrical devices. For example, a common complaint with rechargeable household batteries is that they can only be charged a certain number of times (approximately 500) before their capacity degrades.

Ultimately, the new MRAM technology could enable more powerful commercial computing while also offering a longer use life, the scientists said. That's because the new switching technique requires far less power than previous solutions, has a greater resilience than current RAM technologies and does not require moving parts.

]]>
https://www.livescience.com/technology/computing/scientists-create-magnetic-ultra-efficient-universal-memory-that-consumes-much-less-energy-than-previous-prototypes VVDN4AjTGziDHMJKyaahFM Fri, 21 Mar 2025 12:00:00 +0000
<![CDATA[ China achieves quantum supremacy claim with new chip 1 quadrillion times faster than the most powerful supercomputers ]]> Researchers in China have developed a quantum processing unit (QPU) that is 1 quadrillion (10¹⁵) times faster than the best supercomputers on the planet.

The new prototype 105-qubit chip, dubbed "Zuchongzhi 3.0," which uses superconducting qubits, represents a significant step forward for quantum computing, scientists at the University of Science and Technology of China (USTC) in Hefei said.

It rivals the benchmarking results set by Google's latest Willow QPU in December 2024 that allowed scientists to stake a claim for quantum supremacy — where quantum computers are more capable than the fastest supercomputers — in lab-based benchmarking.

The scientists used the processor to complete a task on the widely used quantum computing random circuit sampling (RCS) benchmark in just a few hundred seconds, they said in a new study published March 3 in the journal Physical Review Letters.

The test, an 83-qubit, 32-layer random circuit sampling task, was also completed 1 million times faster than the result set by Google's previous-generation Sycamore chip, published in October 2024. By contrast, Frontier, the second-fastest supercomputer in the world, would need an estimated 5.9 billion years to complete the same task.

Related: World's 1st modular quantum computer that can operate at room temperature goes online

Although the results suggest QPUs are capable of achieving quantum supremacy, the specific RCS benchmark used favors quantum methods. Improvements in the classical algorithms that drive classical computing may also close the gap, as happened after 2019, when Google scientists first announced that a quantum computer had outperformed a classical computer — in the first use of the RCS benchmark.

"Our work not only advances the frontiers of quantum computing, but also lays the groundwork for a new era where quantum processors play an essential role in tackling sophisticated real-world challenges," the scientists said in the study.

Rivaling Google's best quantum processor

The latest iteration of Zuchongzhi includes 105 transmon qubits — devices made from metals like tantalum, niobium, and aluminum that have reduced sensitivity to noise — in a 15-by-7 rectangular lattice. This builds on the previous chip, which included 66 qubits.

One of the most important areas critical to the viability of quantum computing in real-world settings is coherence time, a measure of how long a qubit can maintain its superposition and tap into the laws of quantum mechanics to perform calculations in parallel. Longer coherence times mean more complicated operations and calculations are possible.

Another major improvement was in gate fidelity and quantum error correction, which has been an obstacle to building useful quantum computers. Gate fidelity measures how accurately a quantum gate performs its intended operation, where a quantum gate is analogous to a classical logic gate, performing a specific operation on one or more qubits, manipulating their quantum state. Higher fidelity qubits mean fewer errors and more accurate computations.

Zuchongzhi 3.0 performed with an impressive parallel single-qubit gate fidelity of 99.90%, and a parallel two-qubit gate fidelity of 99.62%. Google's Willow QPU edged it slightly, with results of 99.97% and 99.86% respectively.
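Rough arithmetic (our own, assuming independent gate errors, which real devices only approximate) shows why those decimal places matter: the chance that a circuit runs without a single two-qubit gate error falls as the fidelity raised to the number of gates.

```python
fidelities = {"Zuchongzhi 3.0": 0.9962, "Willow": 0.9986}   # two-qubit gate fidelities

gates = 1000   # a modest circuit depth
for name, f in fidelities.items():
    # probability that all 1,000 two-qubit gates succeed
    print(name, round(f ** gates, 3))
```

At 1,000 gates, that 0.24-point fidelity gap compounds into roughly a tenfold difference in error-free runs.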

These improvements were largely possible due to engineering advances, including enhancements in fabrication methods and better-optimized qubit designs, the scientists said in the study. For instance, the latest iteration lithographically defines qubit components using tantalum and aluminum, bonded through an indium bump flip-chip process. This improves accuracy and minimizes contamination.

]]>
https://www.livescience.com/technology/computing/china-achieves-quantum-supremacy-claim-with-new-chip-1-quadrillion-times-faster-than-the-most-powerful-supercomputers yzxbKqkeFuSn3g89w5StDn Thu, 13 Mar 2025 13:00:20 +0000
<![CDATA[ World's 1st modular quantum computer that can operate at room temperature goes online ]]> Scientists have developed a quantum computer that uses light to process data, paving the way for quantum computers that can operate in a networked environment at room temperature.

The new system, called Aurora, is the first photonic quantum computer in the world that can operate at scale using several modules interconnected through fiber optic cables. The system presents a solution to some of quantum computing's biggest problems — namely operation at scale, fault tolerance and error correction, Xanadu representatives say.

This breakthrough could lead to the creation of viable quantum data centers with higher fault tolerance and lower error rates than we can otherwise achieve today, the researchers said in a study published Jan. 22 in the journal Nature.

"The two big challenges remaining for the industry are the improved performance of the quantum computer (error correction and fault tolerance) and scalability (networking)," Christian Weedbrook, the founder and CEO of Xanadu, the company behind the new system, said in a statement.

Qubits, most commonly superconducting qubits, are the building blocks of quantum computing and hold the key to processing massive amounts of data quickly.

Related: Quantum internet breakthrough after 'quantum data' transmitted through standard fiber optic cable for 1st time

But these qubits use microwave signals to help process data, which creates heat that can damage hardware. Further, current cooling methods, which are used to create a near absolute zero computing environment, also damage hardware and make accessing machines difficult.

By using light-based, or photonic, qubits instead of microwave or superconducting qubits, Weedbrook and his team created a light-based system that uses networked photonic chips. This makes Aurora inherently connectible, as fiber optics make up the basis of the global networking system.

Light-powered quantum computing networks

Aurora's developers posit that by breaking quantum computers into smaller, less error-prone components, they can strengthen quantum error correction by interconnecting the units.

"The fundamental problem of fault tolerance and finding ways to error-correct the quantum states faster than the errors occur remains a big challenge to performing any useful computations," said Darran Milne, doctor of quantum information theory and CEO of tech company VividQ, who was not involved in the project.

"Rather than trying to compute with a single large quantum computer it seems they [Xanadu] are trying to split it into smaller simpler systems that might be easier to error-correct individually," Milne told Live Science. "It remains to be seen if that actually makes the problem any better or just multiplies the errors."

The framework relies on technology used in the company's X8 (quantum computing hardware) and Borealis (single-system quantum computer). The system utilizes 35 photonic chips connected through 8 miles (13 kilometers) of fiber optic cables.

"Photonics really is the best and most natural way to both compute and network," the researchers said in the statement. "We now could, in principle, scale up to thousands of server racks and millions of qubits."

Potential applications of the Aurora photonic quantum computer framework include simulating molecules and calculating potential outcomes of pharmaceutical trials, potentially eliminating the need for long drug trials. Photonic quantum computers might also usher in the age of highly secure, encrypted communications known as quantum cryptography.

The team at Xanadu next plans to focus on eliminating the signal degradation caused by optical loss in the fiber optic links.

]]>
https://www.livescience.com/technology/computing/worlds-1st-modular-quantum-computing-data-center-that-can-operate-at-room-temperature-goes-online 5Rnent5UCj2viYoiTNaLKf Fri, 07 Mar 2025 12:30:00 +0000
<![CDATA[ Scientists discover simpler way to achieve Einstein's 'spooky action at a distance' thanks to AI breakthrough — bringing quantum internet closer to reality ]]> Scientists have used AI to discover an easier method to form quantum entanglement between subatomic particles, paving the way for simpler quantum technologies.

When particles such as photons become entangled, they can share quantum properties — including information — regardless of the distance between them. This phenomenon is important in quantum physics and is one of the features that makes quantum computers so powerful.

But the bonds of quantum entanglement have typically proven challenging for scientists to form. The standard approach requires preparing two separate entangled pairs, then performing a joint measurement, called a Bell-state measurement, on one photon from each pair.

These measurements cause the quantum system to collapse and leave the two unmeasured photons entangled, despite them never having directly interacted with one another. This process of “entanglement swapping” could be used for quantum teleportation.

In a new study, published Dec. 2, 2024 in the journal Physical Review Letters, scientists used PyTheus, an AI tool that has been specifically created for designing quantum-optic experiments. The authors of the paper initially set out to reproduce established protocols for entanglement swapping in quantum communications. However, the AI tool kept producing a much simpler method to achieve quantum entanglement of photons.

Related: Quantum data beamed alongside 'classical data' in the same fiber-optic connection for the 1st time

"The authors were able to train a neural network on a set of complex data that describes how you set up this kind of experiment in many different conditions, and the network actually learned the physics behind it," Sofia Vallecorsa, a research physicist for the quantum technology initiative at CERN, who was not involved in the new research, told Live Science.

Tapping into AI to simplify quantum entanglement

The AI tool proposed that entanglement could emerge because the paths of the photons were indistinguishable: when there are several possible sources a photon could have come from, and those origins cannot be told apart, entanglement can be produced between the photons where none existed before.

Although the scientists were initially skeptical of the results, the tool kept returning the same solution so they tested the theory. By adjusting the photon sources and ensuring they were indistinguishable, the physicists created conditions where detecting photons at certain paths guaranteed that two others emerged entangled.

This breakthrough simplifies the process by which quantum entanglement can be formed. In the future, it could have implications for the quantum networks used for secure messaging, making these technologies much more feasible.

"The more we can rely on simple technology, the more we can increase the range of applications," Vallecorsa said. "The possibility to build more complex networks, that could branch out in different geometries, could have a big impact with respect to the single end-to-end case."

Whether it is practical to scale the technology into a commercially viable process remains to be seen, however, as environmental noise and device imperfections could cause instability in the quantum system.

The new study has also provided a convincing argument for the use of AI as a research tool by physicists. "We are looking more into introducing AI, but there is still a little bit of skepticism, mostly due to what the role of the physicist is going to be once we start going that way," Vallecorsa said. "It is an opportunity for getting a very interesting result and shows in a very compelling way how this can be a tool that physicists use."

]]>
https://www.livescience.com/technology/computing/scientists-discover-simpler-way-to-achieve-einsteins-spooky-action-at-a-distance-thanks-to-ai-bringing-quantum-internet-closer-to-reality 6aD8fpMsp636qfo3QexXdP Wed, 05 Mar 2025 13:00:10 +0000
<![CDATA[ AWS launches 'Ocelot' quantum processor — a chip inspired by Schrödinger's cat that corrects errors exponentially with scale ]]> Amazon Web Services (AWS) has launched a prototype quantum computing chip that is the first in the world to be fitted with error-resistant "cat qubits" — basic units of quantum computing information inspired by the famous Schrödinger's cat thought experiment.

The quantum processing unit (QPU), named "Ocelot," includes five data qubits, or cat qubits, to store information; five buffer circuits made from the superconductor tantalum to stabilize the cat qubits; and four additional qubits to detect errors that occur during data processing.

These internal components are divided across two integrated silicon microchips that each measure roughly 0.16 square inches (1 square centimeter), making the device small enough to fit on the tip of your finger.

The new architecture is designed to significantly reduce the cost and energy needed to slash errors that occur naturally in quantum computers — a challenge scientists are still trying to find a solution to (with progress made in a February 2024 study and another in April last year, among others).

Significantly, the researchers said the new technology could exponentially reduce errors as more qubits are added to future versions of the chip. They outlined their findings in a new study published Feb. 26 in the journal Nature.

Turning down the quantum noise

Because qubits are inherently "noisy" — meaning they're sensitive to disturbances from vibrations, heat, electromagnetic interference and radiation from space — they are far more prone to failing than classic bits. The error rate in classic bits is 1 in 1 million million, versus roughly 1 in 1,000 in qubits. This far higher error rate often leads to the collapse of any quantum superposition mid-calculation and failures when quantum computations are being performed.

The two types of error are bit-flip errors, where the probability of measuring 0 becomes the probability of measuring 1; and phase-flip errors, where a qubit rotates 180 degrees on its vertical axis. Bit-flip errors affect both bits and qubits, while phase-flip errors affect only qubits. The need to correct both types of error in quantum systems requires significant resources compared with error correction in classical computing.

Related: Google 'Willow' quantum chip has solved a problem the best supercomputer would have taken a quadrillion times the age of the universe to crack

Because of this, scientists say that a quantum computer would need millions of qubits before getting close to achieving "quantum supremacy" — which would be unfeasible in terms of the physical space, energy and resources required to build and run such a hypothetical machine. This is why more research is focused on building reliable qubits integrated with error correction technologies.

"Logical qubits" — which are made up of multiple physical qubits that store the same information to spread the points of failure — are the prevailing error-correction method. AWS researchers, however, say that without further improvements to the hardware, current approaches come at a huge and prohibitive cost, because they would need thousands of physical qubits to form one logical qubit capable of achieving low error rates.

Ocelot, however, adopts the cat qubit design developed by the French startup Alice & Bob. Named after the famous Schrödinger's cat thought experiment, this qubit is designed in such a way that it is inherently resistant to bit-flip errors.

Tapping into new 'cat qubits'

Unlike the conventional superconducting qubits used in machines built by the likes of IBM and Google that can achieve a superposition of 1 and 0, cat qubits can achieve a double superposition of two quantum states simultaneously. Alice & Bob scientists outlined how this technology works in a roadmap and white paper published in 2024.

The cat qubit uses a quantum superposition of classical-like states of well-defined amplitude and phase to encode information. It uses bosonic particles specifically to encode data — in this case, photons, or particles of light.

The more energy is pumped into the system, the more photons are created, and the more amplitudes, or oscillator states, can be accessed, which better protects quantum information. Increasing the number of photons in the oscillator can make the rate of bit-flip errors exponentially smaller, the scientists said. This means that, to reduce the error rate, you don't need to increase the qubit count; rather, you need to increase the energy of the oscillator.

Previous experiments over the last decade have shown the potential of cat qubits in single-qubit demonstrations, including a study from a different team in 2015 and one as recently as May 2024. A study published in January this year also outlined an approach to error correction that was inspired by Schrödinger's cat. However, AWS's Ocelot is the first example of a coherent multi-cat qubit system integrated into a chip built using existing fabrication methods.

The AWS Ocelot quantum processing unit. The processor was used to demonstrate that the error rate fell from 1.72% when using three cat qubits to 1.65% when using five cat qubits. (Image credit: AWS)

In the new study, the scientists demonstrated measurements taken with Ocelot that show bit-flip errors are exponentially suppressed at the physical qubit level, while phase-flip errors are corrected using the simplest error-correcting code, known as repetition code. The gates between the cat qubits and error-correcting qubits are also effective at detecting phase-flip errors, while preserving the power of the cat qubits to protect against bit-flip errors.
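A classical stand-in (our own simulation; a real repetition code acts on qubits and needs fault-tolerant syndrome measurements) shows the idea behind the repetition code: store several copies, let noise flip each copy independently, and decode by majority vote.

```python
import random

def logical_error_rate(copies, p, trials=100_000):
    """Fraction of trials in which majority-vote decoding returns the wrong bit."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(copies))
        if flips > copies // 2:   # a majority of copies were corrupted
            errors += 1
    return errors / trials

for copies in (1, 3, 5):
    print(copies, logical_error_rate(copies, p=0.1))
```

With a 10% physical error rate, three copies cut the logical error rate to about 2.8%, and five copies to below 1% — the same scaling logic the Ocelot team applies to phase-flip errors.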

The results showed bit-flip times approaching 1 second — roughly 1,000 times longer than the lifetime of conventional superconducting qubits. This was accomplished using four photons, enabling phase-flip times measured in tens of microseconds, which is sufficient for quantum error correction.

The scientists then tested the system to determine how effective this architecture could be at behaving like a logical qubit. The total logical error rate was 1.72% when running code on three cat qubits, versus 1.65% when using five cat qubits. With nine qubits in total (five cat and four error-correcting), they hit error rates comparable to a system with 49 physical qubits.

Scalable quantum computing

The scientists estimate that, using the architecture in Ocelot, a future quantum computer with "transformative societal impact" would need as little as one-tenth of the resources required by standard approaches to quantum error correction.

"Future versions of Ocelot are being developed that will exponentially drive down logical error rates, enabled by both an improvement in component performance and an increase in code distance," co-authors of the study, Fernando Brandão, a Caltech professor of theoretical physics, and Oskar Painter, professor of applied physics at Caltech, said in a technical blog post. "Codes tailored to biased noise, such as the repetition code used in Ocelot, can significantly reduce the number of physical qubits required," they said.

"We believe that Ocelot's architecture, with its hardware-efficient approach to error correction, positions us well to tackle the next phase of quantum computing: learning how to scale," Brandão and Painter added. "Scaling using a hardware-efficient approach will allow us to achieve more quickly and cost-effectively an error-corrected quantum computer that benefits society."

]]>
https://www.livescience.com/technology/computing/new-ocelot-quantum-processor-inspired-by-schrodingers-cat-could-scale-up-quantum-computers-by-massively-slashing-errors DKPvu5DuXD73g75GCE4FcN Thu, 27 Feb 2025 15:45:10 +0000
<![CDATA[ Scientists create world's 1st chip that can protect data in the age of quantum computing attacks ]]> Engineers have demonstrated a new communications system designed to protect telecommunications against quantum computing attacks.

The system, called "QS7001," was presented on Jan. 22 by representatives of the Swiss semiconductor company SEALSQ at the World Economic Forum in Davos, Switzerland.

To protect data transmitted over the internet, from payment information to personal medical records, the contents of messages are encrypted.

Encryption scrambles information using mathematical problems so complex that they cannot be solved without a "key," which only the authorized parties (the sender and receiver) have access to. Although encryption does not in itself prevent interception of the message, it prevents anyone from reading the contents.

However, scientists theorize that the massive processing power of future quantum computers would allow them to solve complex equations in seconds, where classical computers would have taken millions of years. They therefore have the potential to break conventional encryption technologies, such as RSA encryption.

Related: Schrödinger's Cat breakthrough could usher in the 'Holy Grail' of quantum computing, making them error-proof

A weak, 50-bit RSA key (NIST recommends a minimum of 2,048 bits) has already been broken using quantum computers. Global communications could be disrupted if people could no longer securely transmit messages over the internet free from the threat of interception.
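
To see why key size is the whole game, consider this toy sketch (hypothetical numbers throughout): breaking RSA reduces to factoring the public modulus, and a deliberately tiny modulus falls to classical brute force in an instant — the same step that Shor's algorithm on a quantum computer would accelerate for real key sizes:

    # Toy illustration: recovering an RSA private key by factoring n.
    # A 2,048-bit modulus is far beyond brute force; this tiny one is not.
    n, e = 2773, 17             # hypothetical toy public key (n = 47 * 59)
    ciphertext = pow(42, e, n)  # encrypt the message "42" with the public key

    p = next(f for f in range(2, n) if n % f == 0)  # brute-force factor n
    q = n // p
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)         # private exponent follows from the factors

    print("factors:", p, q)                             # -> 47 59
    print("recovered message:", pow(ciphertext, d, n))  # -> 42

Scaling the modulus up to 2,048 bits makes the factoring step astronomically hard for classical machines — which is exactly the assumption a sufficiently large quantum computer would undermine.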

The QS7001 system combines two quantum-resistant encryption protocols standardized by NIST (Dilithium and Kyber) with a reduction in data transmission time — thereby closing the possible window of opportunity for attacks.

"It’s the evolution of the ever-present arms race between technology to keep us safe and technology that can be used to undo it," Dave Lear, a cybersecurity analyst, told Live Science.

Narrowing the window of opportunity

Quantum-resistant protocols are new encryption techniques that have proved resistant to quantum computing attacks — in that quantum computers are unable to solve the cryptographic key to the encryption and access the information. However, quantum computers are becoming increasingly powerful and could, in the future, even break encryption that is currently resistant to quantum attacks.

"The producers are claiming it’s quantum-resistant, but until it’s properly tested in the wild — and attacked by determined adversaries — we won’t know for sure," said Lear.

In the demonstration, it took a traditional secure microcontroller up to 1,500 milliseconds (one and a half seconds) to transmit sample data protected using the Dilithium encryption protocol. Using SEALSQ’s QS7001, it took approximately 100 ms (one-tenth of a second) to transmit the same data.

This reduced transmission time was achieved by efficiently authenticating, signing and encrypting data while still adhering to the same stringent security certifications. This technique reduced the time that a quantum computer had to intercept and break the encryption of messages.

It is worth noting that this method does not prevent intercepted information from being copied and stored — and at that point, a quantum computer would not be constrained by the reduced transmission time. However, what the QS7001 does is narrow the window of opportunity for interception and prevent intercepted messages from being modified or misdirected.

There are also emerging quantum communication technologies that can be used to detect if a message is being intercepted and cancel the transmission. If QS7001 were to be combined with quantum communications, this could become a powerful tool for protecting our information on a post-quantum internet.

"If it takes longer to decrypt than that key is valid for, then your message is protected," says Lear. "Until they develop a faster tool, of course."

]]>
https://www.livescience.com/technology/computing/scientists-create-worlds-1st-chip-that-can-protect-data-in-the-age-of-quantum-computing-attacks spUUbE6UsWHofC4MW2HDcG Tue, 25 Feb 2025 12:00:00 +0000
<![CDATA[ AI-designed chips are so weird that 'humans cannot really understand them' — but they perform better than anything we've created ]]> Engineering researchers have demonstrated that artificial intelligence (AI) can design complex wireless chips in hours, a feat that would have taken humans weeks to complete.

Not only did the chip designs prove more efficient, but the AI also took a radically different approach — one that a human circuit designer would have been highly unlikely to devise. The researchers outlined their findings in a study published Dec. 30, 2024, in the journal Nature Communications.

The research focused on millimeter-wave (mm-Wave) wireless chips, which present some of the biggest challenges facing manufacturers due to their complexity and need for miniaturization. These chips are used in 5G modems, now commonly found in phones.

Manufacturers currently rely on a mix of human expertise, bespoke circuit designs and established templates. Each new design then goes through a slow process of optimization, based on trial and error because it is often so complex that a human cannot fully understand what is happening inside the chip. This leads to a cautious, iterative approach based on what has worked before.

Related: 6G speeds hit 100 Gbps in new test — 500 times faster than average 5G cellphones

In this case, however, researchers at Princeton Engineering and the Indian Institute of Technology posited that deep-learning-based AI models could use an inverse design method — one that specifies the desired output and leaves the algorithm to determine the inputs and parameters.
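
As a rough sketch of what inverse design means in code — a toy stand-in, not the Princeton team's deep-learning pipeline — the snippet below fixes a target output and lets a simple random local search discover input parameters that produce it:

    import math, random

    def response(gain, cutoff, freq):
        # Toy one-pole filter response; stands in for a real circuit simulator.
        return gain / math.sqrt(1 + (freq / cutoff) ** 2)

    freqs = [1, 2, 4, 8, 16]
    target = [response(2.0, 5.0, f) for f in freqs]  # the desired output

    def loss(params):
        return sum((response(*params, f) - t) ** 2 for f, t in zip(freqs, target))

    # Inverse design: start from a guess and keep any move that fits better.
    params = [1.0, 1.0]
    for _ in range(20_000):
        candidate = [p + random.gauss(0, 0.05) for p in params]
        if loss(candidate) < loss(params):
            params = candidate

    print(f"recovered gain ≈ {params[0]:.2f}, cutoff ≈ {params[1]:.2f}")

A deep-learning model replaces the blind search with a network trained to map desired behavior straight to a geometry, which is how it can land on layouts no template-driven human workflow would reach.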

The AI also considers each chip as a single artifact, rather than a collection of existing elements that need to be combined. This means that established chip design templates, the ones that no one understands but probably hide inefficiencies, are cast aside.

The future of chip design?

In this experiment, the resulting structures "look randomly shaped," said lead author Kaushik Sengupta, a professor of electrical and computer engineering at Princeton. "Humans cannot really understand them."

And when Sengupta’s team manufactured the chips, they found the AI's creations hit performance levels beyond those of existing designs.

An enlarged image of the chip’s circuitry shows unusual patterns.

(Image credit: Princeton University)

Although the findings suggest that the design of such complex chips could be handed over to AI, Sengupta was keen to point out that pitfalls remain “that still require human designers to correct.” In particular, many of the designs produced by the algorithm did not work — equivalent to the "hallucinations" produced by current generative AI tools.

"The point is not to replace human designers with tools," said Sengputa. "The point is to enhance productivity with new tools."

The speed with which iterative designs can be developed opens up new possibilities, too. Some chip designs can be geared towards energy efficiency, others to outright performance or to extending the frequency range.

Wireless chips are of growing importance, with an ever-growing demand for miniaturization, so this research is a valuable step forward. But Sengupta said that if his team’s method can be extended to other parts of a circuit’s design, it could change the way we design electronics in the future. "This is just the tip of the iceberg in terms of what the future holds for the field."

]]>
https://www.livescience.com/technology/computing/humans-cannot-really-understand-them-weird-ai-designed-chip-is-unlike-any-other-made-by-humans-and-performs-much-better xiikbftj8TES5ZwZuHesND Thu, 20 Feb 2025 12:30:40 +0000
<![CDATA[ Breakthrough quantum chip that harnesses new state of matter could set us on the path to quantum supremacy ]]> Scientists at Microsoft have built a new quantum computing chip using a special category of material capable of tapping into a new state of matter. This breakthrough could enable researchers to build a single chip with millions of reliable qubits much sooner than experts predicted — possibly within just a few years rather than decades.

The new quantum processing unit (QPU), called "Majorana 1," is an eight-qubit prototype chip built from the first material of its kind in the world — a topological conductor, or topoconductor. Under the right conditions, this material can reach the "topological" state of matter and tap into the laws of quantum mechanics to process the 1s and 0s of computing data in a quantum computer.

The new type of qubit, called a "topological qubit," is stable, smaller, less power-draining and more scalable than a qubit made from a superconducting metal — the most common type of qubit used in quantum computers built by companies such as Google, IBM, and Microsoft itself.

"We took a step back and said 'Ok, let's invent the transistor for the quantum age. What properties does it need to have?'," Chetan Nayak, Microsoft technical fellow and professor of physics at the University of California Santa Barbara, said in a statement. "And that's really how we got here — it's the particular combination, the quality and the important details in our new materials stack that have enabled a new kind of qubit and ultimately our entire architecture."

Related: Quantum simulation breakthrough will lead to 'discoveries impossible in today's fastest supercomputers,' Google scientists claim

The fabrication of this QPU was only possible after researchers, for the first time, used the architecture to definitively observe and control an enigmatic subatomic particle with special properties called the "Majorana fermion," or "Majorana zero mode" (MZM), theorized by the physicist Ettore Majorana in 1937.

Ettore Majorana headshot from the 1930s

Majorana's theory proposed that a particle could be its own antiparticle, meaning two such particles could coexist rather than annihilating each other. (Image credit: Getty Images/Mondadori Portfolio/Contributor)

Scientists have previously tried to create Majorana fermions to use for a new kind of quantum computing. Explorations of the Majorana fermion and its proposed use in quantum computers span many years, including a reported discovery of the particle in 2012 and in April 2024. Scientists in June 2023 also published a study reporting the discovery of the topological state of matter.

Majorana's theory proposed that a particle could be its own antiparticle. That means it's theoretically possible to bring two of these particles together, and they will either annihilate each other in a massive release of energy (as is normal) or can coexist stably when pairing up together — priming them to store quantum information.

These subatomic particles do not exist in nature, so to nudge them into being, Microsoft scientists had to make a series of breakthroughs in materials science, fabrication methods and measurement techniques. They outlined these discoveries — the culmination of a 17-year-long project — in a new study published Feb. 19 in the journal Nature.

This is a 'transistor for the quantum age'

Chief among these discoveries was the creation of this specific topoconductor, which is used as the basis of the qubit. The scientists built their topoconductor from a material stack that combined a semiconductor made of indium arsenide (typically used in devices like night vision goggles) with an aluminum superconductor.

The researchers needed the right combination of these components to trigger the desired transition into the new topological state of matter. They also needed to create very specific conditions to achieve this — namely, temperatures near absolute zero and exposure to magnetic fields. Only then could they usher MZMs into existence.

The Majorana-1 quantum computing chip

The new Majorana 1 quantum processor has eight topological qubits, each composed of superconducting and topological conducting wires fitted alongside MZMs and a semiconductor quantum dot. (Image credit: John Brecher for Microsoft)

To construct one qubit, which is less than 10 microns in size — much smaller than superconducting qubits — the scientists arranged a set of nanowires into an H shape, with two longer topoconducting wires joined in the center by one superconducting wire. They next induced four MZMs to exist on all four points of the H by cooling the structure down and tuning it with magnetic fields. Finally, to measure the signal when the device would be operational, they connected the H with a semiconductor quantum dot — equivalent to a small capacitor that carries charge.

Topoconductors differ from superconductors in how they behave when burdened with an unpaired electron. In superconductors, electrons usually pair up into "Cooper pairs"; accommodating an unpaired electron requires a massive amount of energy, forcing the system into an excited state. The difference in energy between the ground state and the excited state is the basis for the 1s and 0s of data in superconducting qubits.

Like superconductors, topoconductors use the presence or absence of an unpaired electron as the 1s and 0s of computing data, but the material can "hide" unpaired electrons by sharing their presence among the paired electrons. This means there is no measurable energy difference when unpaired electrons are added into the system, making the qubit more stable at the hardware level and protecting the quantum information. However, it also means it's harder to measure the qubit's quantum state.

This is where the quantum dot comes in. The scientists beam a single electron from the quantum dot into one end of the wire, through the MZM, and it emerges from the other end, through another MZM. By blasting the quantum dot with microwaves as this happens, the returning reflection carries an imprint of the quantum state of the nanowires.

The accuracy of this measurement is approximately 99%, the scientists said in the study, noting that electromagnetic radiation is one example of an external factor that triggers an error once per millisecond, on average. The scientists said this is rare and indicates that the inherent shielding in the new type of processor is effective at keeping radiation out.

The path to a million qubits

"It's complex in that we had to show a new state of matter to get there, but after that, it's fairly simple. It tiles out. You have this much simpler architecture that promises a much faster path to scale," Krysta Svore, Microsoft's principal research manager, said in the statement.

Svore added that this new qubit architecture, called the "Topological Core," represents the first step on the path to creating workable 1 million-qubit quantum computers — likening its creation to the 20th-century shift from building computers with vacuum tubes to building them with transistors.

This is thanks to the smaller size and higher quality of the qubits, alongside the ease with which they can scale because of the way the qubits fit together like tiles, the scientists said in the study.

In the next few years, the scientists plan to build a single chip with a million physical qubits, which will, in turn, lead to useful scientific breakthroughs in fields like medicine, materials science and our understanding of nature that would be impossible to make using the fastest supercomputers.

The Majorana-1 quantum computing chip

The scientists plan on improving this technology in the coming years to the extent we'll see a quantum chip with a million physical qubits. (Image credit: John Brecher for Microsoft)

The quantum chip does not work in isolation, however. Rather, it exists in an ecosystem alongside a dilution refrigerator to achieve extremely cold temperatures, a system that manages control logic, and software that can integrate with classical computers and artificial intelligence (AI). The scientists said that optimizing these systems so that they can work at a much bigger scale will take years of further research. But this timeline may be expedited with further breakthroughs.

"Those materials have to line up perfectly. If there are too many defects in the material stack, it just kills your qubit," Svore said in the statement. "Ironically, it's also why we need a quantum computer — because understanding these materials is incredibly hard. With a scaled quantum computer, we will be able to predict materials with even better properties for building the next generation of quantum computers beyond scale."

]]>
https://www.livescience.com/technology/computing/quantum-processor-that-uses-entirely-new-state-of-matter-could-set-us-on-the-path-to-quantum-supremacy tKfAomq6KaST89uz8WStbM Wed, 19 Feb 2025 16:00:10 +0000
<![CDATA[ Quantum simulation breakthrough will lead to 'discoveries impossible in today's fastest supercomputers,' Google scientists claim ]]> Scientists at Google have revealed a new method of "quantum simulation" that uses computing power to mimic the behavior of a powerful quantum system. This approach, they argue, could lead to quantum computers that can overtake supercomputers within five years and lead to breakthroughs in drug discovery and battery development.

Quantum simulation is a process in which computers simulate physical processes and large quantum systems, such as complex molecules. Essentially, engineers simulate physical processes that are dominated by the effects of quantum physics.

But this is difficult to do with classical computers because you have to model every particle's interaction with every other particle. Because subatomic particles have a probability of being in multiple states at once and can be entangled with each other, the complexity of these calculations skyrockets quickly as you scale the number of particles involved.

Instead, scientists are turning to quantum computers, whose behavior is already governed by the laws of quantum mechanics, to solve these problems. Because quantum physics is built into the way these systems work, qubits that are entangled or linked together in the right way can mimic bigger quantum systems without having to explicitly calculate every step in the evolution of the system.

That is where "quantum simulation" comes into play. There are two types of quantum simulation. Digital simulation lets researchers selectively pivot between quantum states by entangling and disentangling different qubit pairings (two entangled qubits) in series. Analog simulation, meanwhile, is much faster. This involves entangling all the qubits across a system at once — but since qubits can be error-prone, this raises the risk that the output of the simulation becomes meaningless noise.

Related: Google 'Willow' quantum chip has solved a problem the best supercomputer would have taken a quadrillion times the age of the universe to crack

The new approach to quantum simulation, outlined Feb. 5 in a study published in the journal Nature, takes advantage of both these options by blending digital and analog simulations into a single, multi-staged approach.

Simulation theory

This "hybrid" approach begins with a digital simulation layer, where scientists use the flexibility of the system to prepare the initial quantum states of each qubit pair— choosing the most pertinent position to start from. Next, the process switches to analog simulation, which can evolve toward the specific quantum states the scientists want to study.

Finally, the process switches back to a digital simulation to fine-tune and probe the quantum states to solve the most interesting problems in the physics being simulated.

The new research means that quantum computers will likely outperform conventional supercomputers in practical settings within the next five years, Hartmut Neven, the founder and lead of Google Quantum AI, said in an emailed statement. Estimates vary greatly, however, with some suggesting this milestone may be as far away as 20 years — or as near as the next couple.

Scientists have already demonstrated that Google's quantum computing chips, including Sycamore and the newly released Willow, can outperform the most powerful supercomputers — but so far only in benchmarking. To achieve supremacy in a practical scenario, the scientists said they must make further improvements in calibration and control accuracy, as well as improving the hardware. They also need to identify problems that both can be solved by quantum simulation and are too complex to address using classical computers.

However, the new hybrid research enables today's quantum computers to boost the capabilities of the fastest supercomputers, and the Google scientists have already harnessed it to make new scientific discoveries while testing the approach. For example, they addressed questions about how a magnet behaves when it is cooled to extremely low temperatures, and how energy flows from its hot parts to its cold ones.

The hybrid approach was also used to show that the Kibble-Zurek mechanism (KZM) — a widely regarded model that predicts where defects form in a material — did not always hold true. Instead, the new hybrid simulation revealed entirely new physics. This is an example of the kind of discoveries that the hybrid approach quantum simulation can address, the scientists said.

]]>
https://www.livescience.com/technology/computing/quantum-simulation-breakthrough-will-lead-to-discoveries-impossible-in-todays-fastest-supercomputers-google-scientists-claim tYCGEvdLhgjuPD2RoZjXJN Tue, 18 Feb 2025 12:25:00 +0000
<![CDATA[ Surprisingly simple coding trick can slash data center energy usage by 30% ]]> Researchers in Canada have discovered a method to reduce the energy that some data centers consume by as much as 30%.

In 2022, the global electricity consumption by data centers was estimated to be between 240 and 340 terawatt-hours, according to the International Energy Agency (IEA). This is between two and three times as much as cryptocurrency mining, while computing as a whole is responsible for 5% of all energy consumption around the world, the scientists said.

What’s more, data center energy consumption is expected to grow further, according to Goldman Sachs, driven by the exponential growth of artificial intelligence (AI).

But researchers at the University of Waterloo say they have developed a simple, low-cost solution that could cut consumption by almost one-third — and it centers on adding just 30 lines of new code to the Linux operating system.

Improving packet allocation

Nearly all web traffic is routed through data centers, the majority of which use the open-source operating system Linux. Information arrives in "packets," which are then distributed and allocated by the data center’s "front end," Martin Karsten, professor of computer science at the University of Waterloo, explained Jan. 20 in a statement.

Related: The 9 most powerful supercomputers in the world right now

Karsten and the study's co-author, computer science graduate student Peter Cai, devised a small change to make data processing more efficient. The method was first outlined in a study presented in December 2023 in the journal Proceedings of the ACM on Measurement and Analysis of Computing Systems (POMACS) — but the code itself was published this month as part of Linux version 6.13.

"We rearranged what is done and when, which leads to much better usage of the data center’s CPU caches. It’s kind of like rearranging the pipeline at a manufacturing plant, so that you don’t have people running around all the time," Karsten said in the statement.

He teamed up with Joe Damato, distinguished engineer at Fastly, the cloud computing services provider, to develop a small section of code — approximately 30 lines — that would improve Linux’s network traffic processing.

The method identifies and quantifies the direct and indirect costs of asynchronous hardware interrupt requests (IRQ), the process by which packets are allocated, as a major source of overhead. It also proposes that a small modification of the Linux system would significantly improve the efficiency and performance of traditional kernel-based networking by up to 45%, without compromising operational effectiveness.
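
A back-of-the-envelope model makes the overhead concrete. The numbers below are invented for illustration — they are not Waterloo's measurements — but they show why paying a fixed interrupt cost on every packet is so much more expensive than paying it once per batch:

    # Hypothetical per-event costs in microseconds of CPU time.
    IRQ_COST = 5.0     # assumed fixed overhead of servicing one interrupt
    PACKET_COST = 1.0  # assumed work to actually process one packet

    def cpu_seconds(packets, batch_size):
        # Total CPU time when packets arrive in interrupt-driven batches.
        interrupts = -(-packets // batch_size)  # ceiling division
        return (interrupts * IRQ_COST + packets * PACKET_COST) / 1e6

    packets = 1_000_000
    for batch in (1, 8, 64):
        print(f"batch of {batch:>2}: {cpu_seconds(packets, batch):.2f} s of CPU time")

The real kernel change is subtler — it reorders when packet processing happens so the CPU caches stay warm — but the principle is the same: fewer, better-placed interruptions mean less wasted work and less wasted energy.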

"All these big companies — Amazon, Google, Meta — use Linux in some capacity, but they’re very picky about how they decide to use it," said Karsten in the statement. "If they choose to 'switch on' our method in their data centers, it could save gigawatt hours of energy worldwide. Almost every single service request that happens on the Internet could be positively affected by this."

]]>
https://www.livescience.com/technology/computing/surprisingly-simple-coding-trick-can-slash-data-center-energy-usage-by-30-percent WPxgCbdWnJCz8V4vBWChPh Mon, 17 Feb 2025 13:00:00 +0000
<![CDATA[ 'Next generation of laptops': Intel unveils blueprints for a fully repairable and modular computer ]]> Reference designs for a new kind of modular computer could be the gateway toward laptops and mini-PCs that are easily repairable and reduce electronic waste (e-waste), Intel engineers hope.

The computing giant has shown off its ambitions to create a new PC architecture for laptops that revolves around a motherboard split into several modules — comprising a core mainboard and separate I/O modules for handling things like connectivity. This differs from the all-in-one design commonly found in most laptops. The engineers published their blueprints on Jan. 22 in a blog post.

Splitting a traditional motherboard into multiple components creates a scalable design that enables the reuse of components in laptops of different sizes and layouts. This modular approach means you can swap out faulty motherboard components rather than needing to replace the entire mainboard. Adopting a standardized modular design also means laptop makers could cut manufacturing costs and reduce waste.

Related: Nvidia's mini 'desktop supercomputer' is 1,000 times more powerful than a laptop — and it can fit in your bag

"By developing a new approach to system design that allows for easy upgrades and component replacements, we aim to significantly extend the usable life of computing devices, thereby reducing electronic waste and promoting a more sustainable consumption model," three Intel representatives jointly wrote in the blog post.

Illustration created by Intel showing compartments of the new laptop.

Intel is reimagining modular laptops from the core up. (Image credit: Intel)

Modular moves

The crux of modular laptops is that if a component breaks or needs to be upgraded, it can be easily swapped out by the user or a skilled technician without needing to replace other parts of the machine or be sent back to the factory.

This has been the case with some parts of Panasonic's ToughBook laptops and the Framework Laptop 13, a highly modular laptop with a design that enables multiple components to be swapped in and out, and hardware that’s easily accessed — not glued down, as is the case with other laptop brands.

However, such modularity is far from standardized. Unfortunately, as good as the entries on our list of the best laptops for coding and programming and best laptops for students are, these machines have few, if any, upgradable components. If they break, they often need specialist repair.

Intel’s "Modular Architecture Blueprint" aims to change this by taking modularity further than Framework’s laptops and delivering a reference design that addresses modularity from the manufacturing stage, to field repairs and user upgrades. Add in easily swappable memory, storage and Wi-Fi components — still not commonly replaceable in mainstream laptops — and Intel could usher in a laptop design that facilitates more customization for hardware makers yet doesn't stymie DIY repairs.

Intel is also looking to achieve the same with mini-PCs, only with hot-swappable M.2 modules that allow for key components like the graphics processing unit and processor to be easily swapped in a "plug and play" approach. While full-size and compact desktop PCs have always been modular, with the ability to swap main components in and out, the compact nature of mini-PCs makes such modularity a harder proposition.

If Intel can ship these reference designs to computer makers, it could usher in more laptops and mini-PCs that can be upgraded and repaired by users, while also reducing the amount of e-waste that is produced.

]]>
https://www.livescience.com/technology/computing/next-generation-of-laptops-intel-unveils-blueprints-for-a-fully-repairable-and-modular-computer CgmonVdEDLThM3n7jVV86R Fri, 14 Feb 2025 13:30:00 +0000
<![CDATA[ Supercomputer runs largest and most complicated simulation of the universe ever ]]> The potential for our understanding of the universe has taken a giant leap forward after Frontier, a supercomputer based in the Oak Ridge National Laboratory (ORNL), created a simulation of the universe at a scale never before achieved.

Frontier used a software platform called the Hardware/Hybrid Accelerated Cosmology Code (HACC) as part of ExaSky, a project within the U.S. Department of Energy's (DOE) $1.8 billion Exascale Computing Project — the largest software R&D initiative backed by the DOE.

Under ExaSky, scientific applications were required to run up to 50 times faster than previous benchmarks, but Frontier and HACC quickly raced ahead of expectations — running almost 300 times faster than similar simulations on Titan, ORNL's earlier supercomputer. The DOE/HACC team had spent the seven years since those first simulations enhancing the code's capabilities for exascale supercomputers like Frontier.

This allowed for hydrodynamic cosmology simulations, a far more computationally intensive computer model that incorporates principles like the expansion of the universe and the influence of dark matter. Previous models only incorporated measures of gravity, gas or plasma.

The power of exascale computing

The simulation, which was conducted in November 2024, used around 9,000 of Frontier's computing nodes, all fitted with AMD Instinct MI250X graphics cards.

Frontier is the second-fastest supercomputer in the world and can hit 1.4 exaFLOPS of power. The performance of supercomputers is measured in floating-point operations per second (FLOPS) — where one floating-point operation is a mathematical calculation.

Related: Nvidia's mini 'desktop supercomputer' is 1,000 times more powerful than a laptop — and it can fit in your bag

Anything capable of 1,000 petaFLOPS (1 exaFLOPS) or more is referred to as an "exascale" supercomputer. The only other machine more powerful than Frontier is El Capitan — which can reach 1.7 exaFLOPS.

Beyond simulating the universe, Frontier has been used in other crucial research. In April 2023, scientists built the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM) — a program that simulated an entire year of global climate data down to a resolution of just over 3 kilometers. One of the most complex climate models ever computed, it is now a cornerstone in the analysis of complex interactions between the atmosphere, oceans and land to improve weather predictions and gather higher-fidelity data about climate change.

In materials science, Frontier has let designers come up with new substrates and geometries for enhanced-property substances, making them stronger, lighter and corrosion-proof. Its exascale computing capabilities have allowed researchers to model chemical interactions at the molecular scale to predict material behavior. The supercomputer has also been pivotal in finding new materials for energy storage, transport, manufacturing and nuclear medicine.

However, scientists are particularly excited about how exascale computing can supercharge artificial intelligence (AI). The speed of these machines lets programmers iterate algorithms and analyze large datasets rapidly, especially in tasks like faster large language models or the application of supercomputers to climate models and climate change prediction.

]]>
https://www.livescience.com/technology/computing/supercomputer-runs-largest-and-most-complicated-simulation-of-the-universe-ever a3mMNWy58D6EJKsSup9cSK Thu, 13 Feb 2025 12:00:00 +0000
<![CDATA[ World's 1st hybrid quantum supercomputer goes online in Japan ]]> Engineers in Japan have switched on the world's first hybrid quantum supercomputer.

The 20-qubit quantum computer, called Reimei, has been integrated into Fugaku — the world's sixth-fastest supercomputer. The hybrid platform will work to tackle calculations that can take classical supercomputers much longer to process.

The machine, which is housed at the Riken scientific institute in Saitama, near Tokyo, will be used primarily for physics and chemistry research, representatives from Quantinuum, the makers of Reimei, and Riken said in a joint statement.

Quantum computers could one day overtake classical computers, with the potential to complete calculations in minutes or seconds that would otherwise take today's most powerful machines millions of years. However, until quantum computers are large and reliable enough, scientists say that integrating their capabilities into supercomputers can be a stopgap.

Unlike most quantum computers that use superconducting qubits, Reimei uses trapped-ion qubits. This involves isolating charged atoms, or ions, in an electromagnetic field — known as an ion trap — and using lasers to precisely control their quantum state.

Related: Google's Sycamore quantum computer chip can now outperform the fastest supercomputers, new study suggests

This enables the scientists to manipulate the ions so they can be used as qubits that store and process quantum information. Trapped-ion qubits allow more connections between qubits and longer coherence times, whereas superconducting qubits have faster gate operations and are easier to fabricate on chips.

Riken representatives said they chose Quantinuum's quantum computer for the integration because it has a unique architecture that physically moves qubits. This process of "ion shuttling" allows qubits to be moved around a circuit as required, allowing for more complex algorithms.

Error-correcting system

Qubits are inherently "noisy," so to effectively scale up quantum computers, scientists are developing error-correction techniques to increase the fidelity of qubits.

In Reimei, the physical ion qubits have been grouped to create "logical qubits" — meaning a set of physical qubits that store the same information in several places. Logical qubits are a key route to achieving a desired reduction in qubit errors, because distributing the information in different places spreads out the points of failure, meaning a qubit failure does not disrupt an ongoing calculation.

Quantinuum previously achieved a breakthrough in creating a logical qubit with an error rate 800 times lower than that of physical qubits, which it integrated into its quantum computing processors.

While Reimei-Fugaku is the first fully operational, integrated hybrid system, other companies have previously tested such systems. In June 2024, IQM integrated a 20-qubit quantum processor into the SuperMUC-NG supercomputer in Garching, Germany.

That system, however, is still in the testing phase, with no confirmed public date when it becomes fully operational. In October, IQM representatives announced the company would integrate a 54-qubit system into the supercomputer in the latter half of 2025 followed by a 150-qubit chip in 2026.

]]>
https://www.livescience.com/technology/computing/worlds-1st-hybrid-quantum-supercomputer-goes-online-in-japan 3Mwt2E3sXbKcqkv2djHQjb Tue, 11 Feb 2025 17:00:00 +0000
<![CDATA[ Coldest-ever qubits could lead to faster quantum computers ]]> A new type of autonomous quantum refrigerator could give quantum computers "a major performance boost" and make them more reliable, scientists say.

In a study published Jan. 9 in the journal Nature Physics, researchers successfully cooled a qubit to just 22 millikelvin (minus 459.63 degrees Fahrenheit, or minus 273.13 degrees Celsius) using a quantum refrigerator powered by "thermal baths" of microwave radiation. This is the lowest temperature that qubits have ever reached.

"This paves the way for more reliable and error-free quantum computations that require less hardware overload," study lead author Aamir Ali, research specialist in quantum technology at Chalmers University of Technology in Sweden, said in a statement.

Quantum computers need to be cooled to extremely low temperatures so scientists can tap into delicate quantum properties and perform calculations — as even the smallest environmental disturbance can "flip" their quantum state, causing errors. This is crucial for superconducting qubits — used in the likes of IBM's 1,000-qubit Condor chip — which need to operate at temperatures close to absolute zero (0 K, minus 459.67 F or minus 273.15 C) to maintain stability.

Cooling qubits to near absolute zero places them in their lowest possible energy state, otherwise known as their "ground state". In this state, qubits are likely to retain their quantum properties long enough to perform calculations accurately.

The new system complements conventional dilution refrigerators — which use helium gases to absorb heat through a dilution process and can bring qubits down to around 50 mK — by cooling qubits further, rather than replacing them altogether.

It does this by harnessing energy from reservoirs of heat created using microwave radiation, which is then directed into one of the quantum refrigerator's two qubits.

“Energy from the thermal environment, channeled through one of the quantum refrigerator’s two qubits, pumps heat from the target qubit into the quantum refrigerator’s second qubit, which is cold. That cold qubit is thermalized to a cold environment, into which the target qubit’s heat is ultimately dumped,” study co-author Nicole Yunger Halpern, adjunct assistant professor of physics and IPST at the University of Maryland, said in the statement.

Using this method, the scientists increased the likelihood that the qubit would be in its ground state before a computation to 99.97%.

Ali said this compares to probabilities between 99.8% and 99.92% achieved with previous techniques. "This might seem like a small difference, but when performing multiple computations, it compounds into a major performance boost in the efficiency of quantum computers," he added.
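
That compounding is easy to check with a few lines of arithmetic: the chance that every reset in a long computation succeeds is the per-reset probability raised to the number of resets. The 1,000-reset count below is an arbitrary illustration:

    resets = 1_000  # hypothetical number of qubit resets in one computation

    for p_ground in (0.998, 0.9992, 0.9997):
        print(f"per-reset success {p_ground:.2%} -> "
              f"all {resets} succeed: {p_ground ** resets:.1%}")

At 99.8% per reset, only about 13% of 1,000-reset runs come out clean; at 99.97%, roughly 74% do — which is why a seemingly tiny improvement compounds into a major one.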

And unlike quantum dilution refrigerators, which are extremely complex and difficult to scale, the new, thermally driven system is autonomous, meaning it doesn't require external control once started. The findings surpassed the researchers' initial expectations.

"Our work is arguably the first demonstration of an autonomous quantum thermal machine executing a practically useful task," study co-author Simone Gasparinetti, associate professor in quantum technology at Chalmers University of Technology, added in the statement. "We initially saw this as a proof of concept, so we were pleasantly surprised to find its performance surpasses all existing reset protocols for cooling qubits to record-low temperatures."

]]>
https://www.livescience.com/technology/computing/coldest-ever-qubits-could-lead-to-faster-quantum-computers wFKntdeu7KEjgDSn8Nd8Cc Fri, 07 Feb 2025 13:15:00 +0000
<![CDATA[ Newly discovered quantum state could power more stable quantum computers — and a new 2D chip can tap into it ]]> Scientists have discovered a new quantum state that engineers can harness in a two-dimensional (2D) semiconductor chip to control quantum information more reliably than ever before. It provides a promising lead into a new method for extracting quantum information from sub-atomic particles.

Recent advances in ultrathin 2D materials — which are only a molecule thick — have created promising candidates for computer chips that pack much more power into much less space. 2D semiconductors also offer fantastic opportunities for quantum computing.

Quantum entanglement, whereby two subatomic particles can share information over time and space through "coherence", is highly delicate, but essential to processing calculations in parallel, rather than in sequence.

Preventing decoherence — the loss of quantum properties in a subatomic structure — is essential for quantum entanglement to be effective in quantum computers, but 3D structures are highly prone to thermal influences (like heat) or stray electromagnetic waves, and usually collapse within fractions of a second. This is where 2D materials come in.

Maintaining coherence in a 2D material is much easier, as they are less prone to these thermal influences that collapse quantum coherence.

Related: Google 'Willow' quantum chip has solved a problem the best supercomputer would have taken a quadrillion times the age of the universe to crack

Although coherence mechanisms are not yet well understood in 2D materials, a new study, published Oct. 9 in the journal Nano Letters, described how scientists discovered a new quantum state that can maintain coherence for longer periods. They also identified a mechanism causing quantum entanglement in this new quantum state, and thus proposed a method by which quantum information can be controlled and extracted from it.

A never-before-seen quantum state

Specifically, for the first time, they observed the exciton formation process in conjunction with Floquet states. Using photoelectron spectroscopy with a 2D semiconductor, the scientists observed the exciton formation — which occurs when a photon excites an electron into a higher energy state. The exciton is a quasi-particle consisting of an electron and a positively charged hole that are bound together.

A further benefit of 2D materials over conventional semiconductors is that their excitons have strong binding energies. In quantum systems driven by a time-periodic field (in this case, the driver is short bursts of photons), quasi-stationary states known as "Floquet states" can occur. These have properties that differ significantly from those of the original undriven systems in an equilibrium state. The newly observed state is a conjunction of these two known conditions.

"We have discovered a new quantum state, known as the exciton-Floquet synthesis state, and proposed a novel mechanism for quantum entanglement and quantum information extraction," Jaedong Lee of Daegu Gyeongbuk Institute of Science and Technology, said in a statement. "This is anticipated to drive forward quantum information technology research in two-dimensional semiconductors."

In the study, the scientists acknowledged the novel quantum states that are transiently formed present a "challenge" for the new applications of 2D semiconducting media, although they did not elaborate on what the main challenge would be in the paper. They are confident, however, that their research promises to pave the way for using 2D semiconductors to create a new type of reconfigurable device to store data in quantum computers.

]]>
https://www.livescience.com/technology/computing/newly-discovered-quantum-state-could-power-more-stable-quantum-computers-by-tapping-into-2d-semiconductor-design fbGPyzXYYQGek8HmjtL3kn Wed, 05 Feb 2025 12:00:10 +0000
<![CDATA[ New laser-based artificial neuron processes enormous data sets at high speed ]]> Scientists have developed a new kind of laser-based artificial neuron that mimics a biological nerve cell. This artificial neuron could boost high-speed computing and artificial intelligence (AI), researchers say.

Artificial neurons mimic nerve cells by activating once they hit a certain information threshold. When a biological neuron takes in enough of the right type of information, it generates an electrical pulse to communicate with nearby neurons. Similarly, artificial neurons process and transmit computational information only once they take in a certain amount of relevant electronic data.

Existing artificial neurons, which are known as photonic spiking neurons, mimic biological spiking neurons by responding to these input signals with all-or-nothing, on-and-off spikes. But the way in which the neurons receive those input signals means that, for a short time after each spike, they cannot respond to new inputs. This brief reset period puts a speed limit on computations performed with artificial spiking neurons.

But the new artificial neurons transmit information via "graded" signals with variable intensity. In the new study, published Dec. 19, 2024 in the journal Optica, researchers used a graded neuron system to surpass spiking neurons' speed limit. Much like a biological graded or "non-spiking" neuron, the laser-based system generated increasingly strong output signals in response to consecutive stimuli, so it didn't need the same reset period as the spiking neurons. As a result, the new artificial neuron transmitted data up to 100,000 times faster than artificial spiking neurons.
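
The behavioral difference between the two neuron types can be sketched in a few lines of heavily simplified (and decidedly non-photonic) code; the thresholds, gains and refractory window below are all invented for illustration:

    def spiking_neuron(inputs, threshold=1.0, refractory=2):
        # All-or-nothing neuron that goes blind while recovering from a spike.
        potential, cooldown, outputs = 0.0, 0, []
        for x in inputs:
            if cooldown > 0:
                cooldown -= 1
                outputs.append(0.0)  # reset period: this input is lost
                continue
            potential += x
            if potential >= threshold:
                outputs.append(1.0)  # fire a fixed-size spike
                potential, cooldown = 0.0, refractory
            else:
                outputs.append(0.0)
        return outputs

    def graded_neuron(inputs, gain=0.8):
        # Graded neuron: output tracks input strength, with no dead time.
        level, outputs = 0.0, []
        for x in inputs:
            level = gain * level + x  # consecutive stimuli build on each other
            outputs.append(round(level, 3))
        return outputs

    stimuli = [0.6, 0.6, 0.6, 0.6, 0.6]
    print("spiking:", spiking_neuron(stimuli))  # drops inputs after each spike
    print("graded: ", graded_neuron(stimuli))   # responds to every input

The spiking version throws away the two stimuli that arrive during its recovery window; the graded version responds to all five with increasingly strong output, which is the property the laser neuron exploits to sidestep the speed limit.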

The researchers incorporated the graded neuron into a reservoir computing system — a type of artificial neural network that processes time-dependent data. They used this system to scan 700 heartbeat samples for arrhythmia. The reservoir processed these heartbeats at a speed of 100 million heartbeats per second — much faster than spiking neural networks can manage. The new system detected arrhythmic patterns with more than 98% accuracy. In a separate experiment, the system analyzed and classified handwritten numbers at a rate of nearly 35 million digits per second with 92% accuracy.

"With powerful memory effects and excellent information processing capabilities, a single laser graded neuron can behave like a small neural network," study co-author Chaoran Huang, an engineer at the Chinese University of Hong Kong, said in a statement. "Therefore, even a single laser graded neuron without additional complex connections can perform machine learning tasks with high performance."

Connecting multiple graded neurons could offer even greater computing power. "In this work, we used a single laser graded neuron, but we believe that cascading multiple laser graded neurons will further unlock their potential, just as the brain has billions of neurons working together in networks," Huang said.

"Our technology could accelerate AI decision-making in time-critical applications while maintaining high accuracy," Huang added. "We hope the integration of our technology into edge computing devices — which process data near its source — will facilitate faster and smarter AI systems that better serve real-world applications with reduced energy consumption in the future."

]]>
https://www.livescience.com/technology/artificial-intelligence/new-laser-based-artificial-neuron-processes-enormous-data-sets-at-high-speed ZreHBjEgTDhxwYvRMyaQ2B Tue, 04 Feb 2025 13:00:00 +0000
<![CDATA[ Biological computers could use far less energy than current technology — by working more slowly ]]> Modern computers are a triumph of technology. A single computer chip contains billions of nanometre-scaled transistors that operate extremely reliably and at a rate of millions of operations per second.

However, this high speed and reliability comes at the cost of significant energy consumption: data centres and household IT appliances like computers and smartphones account for around 3% of global electricity demand, and the use of AI is likely to drive even more consumption.

But what if we could redesign the way computers work so that they could perform computation tasks as quickly as today while using far less energy? Here, nature may offer us some potential solutions.

The IBM scientist Rolf Landauer addressed the question of whether we need to spend so much energy on computing tasks in 1961. He came up with the Landauer limit, which states that a single computational task — for example setting a bit, the smallest unit of computer information, to have a value of zero or one — must expend a minimum of about 10⁻²¹ joules (J) of energy.

Related: History of computers: A brief timeline

This is a very small amount, notwithstanding the many billions of tasks that computers perform. If we could operate computers at such levels, the amount of electricity used in computation and managing waste heat with cooling systems would be of no concern.

However, there is a catch. To perform a bit operation near the Landauer limit, it needs to be carried out infinitely slowly. Computation in any finite time period is predicted to cost an additional amount that is proportional to the rate at which computations are performed. In other words, the faster the computation, the more energy is used.

More recently, this has been demonstrated by experiments set up to simulate computational processes: the energy dissipation begins to increase measurably when you carry out more than about one operation per second. Processors that operate at a clock speed of a billion cycles per second, which is typical in today's semiconductors, use about 10⁻¹¹ J per bit — about ten billion times more than the Landauer limit.

A solution may be to design computers in a fundamentally different way. The reason that traditional computers work at a very fast rate is that they operate serially, one operation at a time. If instead one could use a very large number of "computers" working in parallel, then each could work much slower.

For example, one could replace a "hare" processor that performs a billion operations in one second by a billion "tortoise" processors, each taking a full second to do their task, at a far lower energy cost per operation. A 2023 paper that I co-authored showed that a computer could then operate near the Landauer limit, using orders of magnitude less energy than today's computers.
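
The arithmetic behind that claim is straightforward to reproduce. The short script below evaluates the Landauer limit, kT ln 2, at room temperature and compares it with the roughly 10⁻¹¹ joules per bit quoted above for today's processors:

    import math

    k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin
    T = 300             # roughly room temperature, in kelvin

    landauer = k_B * T * math.log(2)  # minimum energy to process one bit
    per_bit_today = 1e-11             # approximate energy per bit in today's chips

    print(f"Landauer limit at {T} K: {landauer:.2e} J per bit")
    print(f"ratio to today's chips: about {per_bit_today / landauer:.1e}")

The limit works out to about 2.9 × 10⁻²¹ J, putting current processors a few billion times above it — the same order of magnitude as the "ten billion times" figure above, and the headroom the tortoise strategy aims to reclaim.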

Tortoise power

Is it even possible to have billions of independent "computers" working in parallel? Parallel processing on a smaller scale is commonly used already today, for example when around 10,000 graphics processing units or GPUs run at the same time for training artificial intelligence models.

However, this is not done to reduce speed and increase energy efficiency, but rather out of necessity. The limits of heat management make it impossible to further increase the computation power of a single processor, so processors are used in parallel.

An alternative computing system that is much closer to what would be required to approach the Landauer limit is known as network-based biocomputation. It makes use of biological motor proteins, which are tiny machines that help perform mechanical tasks inside cells.

This system involves encoding a computational task into a nanofabricated maze of channels with carefully designed intersections, which are typically made of polymer patterns deposited on silicon wafers. All the possible paths through the maze are explored in parallel by a very large number of long thread-like molecules called biofilaments, which are powered by the motor proteins.

Each filament is just a few nanometres in diameter and about a micrometre long (1,000 nanometres). They each act as an individual "computer", encoding information by its spatial position in the maze.

This architecture is particularly suitable for solving so-called combinatorial problems. These are problems with many possible solutions, such as scheduling tasks, which are computationally very demanding for serial computers. Experiments confirm that such a biocomputer requires between 1,000 and 10,000 times less energy per computation than an electronic processor.

This is possible because biological motor proteins are themselves evolved to use no more energy than needed to perform their task at the required rate. This is typically a few hundred steps per second, a million times slower than transistors.

At present, only small biological computers have been built by researchers to prove the concept. To be competitive with electronic computers in terms of speed and computation, and explore very large numbers of possible solutions in parallel, network-based biocomputation needs to be scaled up.

A detailed analysis shows that this should be possible with current semiconductor technology, and could profit from another great advantage of biomolecules over electrons, namely their ability to carry individual information, for example in the form of a DNA tag.

There are nevertheless numerous obstacles to scaling these machines, including learning how to precisely control each of the biofilaments, reducing their error rates, and integrating them with current technology. If these kinds of challenges can be overcome in the next few years, the resulting processors could solve certain types of challenging computational problems with a massively reduced energy cost.

Neuromorphic computing

Alternatively, it is an interesting exercise to compare the energy use in the human brain. The brain is often hailed as being very energy efficient, using just a few watts — far less than AI models — for operations like breathing or thinking.

Yet it doesn't seem to be the basic physical elements of the brain that save energy. The firing of a synapse, which may be compared to a single computational step, actually uses about the same amount of energy as a transistor requires per bit.

However, the architecture of the brain is very highly interconnected and works fundamentally differently from both electronic processors and network-based biocomputers. So-called neuromorphic computing attempts to emulate this aspect of brain operations, but using novel types of computer hardware as opposed to biocomputing.

It would be very interesting to compare neuromorphic architectures to the Landauer limit, to see whether the same kinds of insights from biocomputing could transfer here in future. If so, it too could hold the key to a huge leap forward in computer energy-efficiency in the years ahead.

This edited article is republished from The Conversation under a Creative Commons license. Read the original article.

]]>
https://www.livescience.com/technology/computing/biological-computers-could-use-far-less-energy-than-current-technology-by-working-more-slowly NXNqAgweFe2ZUxAmoQQqoN Sun, 02 Feb 2025 12:00:00 +0000
<![CDATA[ World's fastest supercomputer 'El Capitan' goes online — it will be used to secure the US nuclear stockpile and in other classified research ]]> The fastest supercomputer in the world has officially launched at the Lawrence Livermore National Laboratory (LNNL) in California.

The supercomputer, called "El Capitan," cost $600 million to build and will handle various sensitive and classified tasks, including securing the U.S. stockpile of nuclear weapons in the absence of underground testing, which the U.S. halted in 1992, according to LLNL representatives.

Research will primarily be focused on national security, including material discovery, high-energy-density physics, nuclear data and weapon design, as well as other classified tasks.

Construction on the machine began in May 2023, and it came online in November 2024, before being officially dedicated on Jan. 9.

Related: The 9 most powerful supercomputers in the world right now

El Capitan became the world's fastest computer when it became fully operational last year, scoring 1.742 exaFLOPS in the High-Performance Linpack (HPL) benchmark — the test used to judge supercomputing speeds all over the world. That makes El Capitan only the third computer ever to reach exascale computing speeds. It has a peak performance of 2.746 exaFLOPS.

El Capitan Supercomputer

(Image credit: Lawrence Livermore National Laboratory (LLNL))

Performance is measured in floating-point operations per second (FLOPS), where one floating-point operation is a single mathematical calculation. Although like-for-like comparisons are tricky, the best laptops usually deliver several hundred gigaFLOPS of power, where a gigaFLOP is 1 billion (10^9) FLOPS. An exaFLOP is 1 quintillion (10^18) FLOPS.
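To make those prefixes concrete, here is a back-of-the-envelope comparison in Python (the 500-gigaFLOPS laptop figure is an assumption for illustration):

```python
# Back-of-the-envelope FLOPS comparison. The laptop figure is illustrative.
laptop_flops = 500e9          # 500 gigaFLOPS = 5 x 10^11 operations/second
el_capitan_flops = 1.742e18   # 1.742 exaFLOPS, El Capitan's HPL score

ratio = el_capitan_flops / laptop_flops
print(f"El Capitan is roughly {ratio:,.0f}x faster than the laptop")  # ~3,484,000x

# Work El Capitan finishes in one second would keep the laptop busy for:
days = ratio / (60 * 60 * 24)
print(f"about {days:.0f} days")  # ~40 days
```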

The next fastest supercomputer in the world is currently the Frontier supercomputer at Oak Ridge National Laboratory in Tennessee. That supercomputer has achieved a standard performance of 1.353 exaFLOPS with a peak of 2.056 exaFLOPS.

El Capitan is powered by just over 11 million processing and graphics cores packed into 44,544 AMD MI300A accelerated processing units — chips that combine AMD EPYC Genoa CPUs, AMD CDNA3 graphics cards and computing memory — according to Next Platform. Each uses 128 gigabytes of high-bandwidth memory — a special type of computing memory that achieves high speeds while consuming less power — shared across central processing unit and graphics processing unit chiplets.

El Capitan was commissioned by the U.S. Department of Energy's CORAL-2 program to replace the Sierra supercomputer, deployed in 2018. Sierra is still in use and ranked as the 14th most powerful supercomputer in the latest Top500 list.

]]>
https://www.livescience.com/technology/computing/worlds-fastest-supercomputer-el-capitan-goes-online-used-to-secure-the-u-s-nuclear-stockpile-and-other-classified-research TQCXww9Vec3VkYP7t8Svd6 Tue, 21 Jan 2025 13:00:40 +0000
<![CDATA[ 'ELIZA,' the world's 1st chatbot, was just resurrected from 60-year-old computer code ]]> Scientists have just resurrected "ELIZA," the world's first chatbot, from long-lost computer code — and it still works extremely well.

Using dusty printouts from MIT's archives, the "software archaeologists" recovered code that had been lost for 60 years and brought it back to life.

ELIZA was developed in the 1960s by MIT professor Joseph Weizenbaum and named for Eliza Doolittle, the protagonist of the play "Pygmalion," who was taught how to speak like an aristocratic British woman.

As a language model that the user could interact with, ELIZA had a significant impact on today's artificial intelligence (AI), the researchers wrote in a paper posted to the preprint database arXiv Sunday (Jan. 12). The "DOCTOR" script written for ELIZA was programmed to respond to questions as a psychotherapist would. For example, ELIZA would say, "Please tell me your problem." If the user typed "Men are all alike," the program would respond, "In what way."
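To get a feel for how such a script works, here is a minimal ELIZA-style sketch in Python. It illustrates only the keyword-and-reflection idea; the rules shown are invented, not Weizenbaum's MAD-SLIP code:

```python
import re

# Minimal ELIZA-style responder: match a keyword pattern, "reflect"
# pronouns, and slot the fragment into a canned template. Real ELIZA used
# ranked keywords and decomposition/reassembly rules written in MAD-SLIP.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r".* all alike", re.I), "In what way?"),
]

def reflect(fragment: str) -> str:
    # Swap first-person words for second-person ones ("my" -> "your").
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(text.strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please tell me your problem."  # ELIZA's default prompt

print(respond("Men are all alike"))   # -> In what way?
print(respond("I need my notebook"))  # -> Why do you need your notebook?
```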

Weizenbaum wrote ELIZA in a now-defunct programming language he invented, called Michigan Algorithm Decoder Symmetric List Processor (MAD-SLIP), but it was almost immediately copied into the language Lisp. With the advent of the early internet, the Lisp version of ELIZA went viral, and the original version became obsolete.

Related: Google's AI tells users to add glue to their pizza, eat rocks and make chlorine gas

Experts thought the original 420-line ELIZA code was lost until 2021, when study co-author Jeff Shrager, a cognitive scientist at Stanford University, and Myles Crowley, an MIT archivist, found it among Weizenbaum's papers.

"I have a particular interest in how early AI pioneers thought," Shrager told Live Science in an email. "Having computer scientists' code is as close to having a record of their thoughts, and as ELIZA was — and remains, for better or for worse — a touchstone of early AI, I want to know what was in his mind." But why the team wanted to get ELIZA working is more complex, he said.

"From a technical point of view, we did not even know that the code we had found — the only version ever discovered — actually worked," Shrager said. So they realized they had to try it.

Reanimating ELIZA

Bringing ELIZA back to life was not straightforward. It required the team to clean and debug the code and create an emulator that would approximate the kind of computer that would have run ELIZA in the 1960s. After restoring the code, the team got ELIZA running — for the first time in 60 years — on Dec. 21.

"By making it run, we demonstrated that this was, in fact, a part of the actual ELIZA lineage and that it not only worked, but worked extremely well," Shrager said.

But the team also found a bug in the code, which they elected not to fix. "It would ruin the authenticity of the artifact," Shrager explained, "like fixing a mis-stroke in the original Mona Lisa." The program crashes if the user enters a number, such as “You are 999 today,” they wrote in the study.

Even though it was intended to be a research platform for human-computer communication, "ELIZA was such a novelty at the time that its 'chatbotness' overwhelmed its research purposes," Shrager said.

Related: 32 times artificial intelligence got it catastrophically wrong

That legacy continues today, as ELIZA is often compared to current large language models (LLMs) and other artificial intelligence.

Even though it does not compare to the abilities of modern LLMs like ChatGPT, "ELIZA is really remarkable when you consider that it was written in 1965," David Berry, a digital humanities professor at the University of Sussex in the U.K. and co-author of the paper, told Live Science in an email. "It can hold its own in a conversation for a while."

One thing ELIZA did better than modern chatbots, Shrager said, is listen. Modern LLMs only try to complete your sentences, whereas ELIZA was programmed to prompt the user to continue a conversation. "That's more like what 'chatting' is than any intentional chatbot since," Shrager said.

"Bringing ELIZA back, one of the most — if not most — famous chatbots in history, opens people's eyes up to the history that is being lost," Berry said. Because the field of computer science is so forward-looking, practitioners tend to consider its history obsolete and don't preserve it.

Berry, though, believes that computing history is also cultural history.

"We need to work harder as a society to keep these traces of the nascent age of computation alive," Berry said, "because if we don't then we will have lost the digital equivalents of the Mona Lisa, Michelangelo's David or the Acropolis."

]]>
https://www.livescience.com/technology/eliza-the-worlds-1st-chatbot-was-just-resurrected-from-60-year-old-computer-code e7xVSwPCrFxuZVREHWKjCC Sat, 18 Jan 2025 05:01:10 +0000
<![CDATA[ Schrödinger's Cat breakthrough could usher in the 'Holy Grail' of quantum computing, making them error-proof ]]> Scientists have used the famed "Schrödinger's cat" thought experiment to come up with a way to remove errors from future quantum computers.

The new method encodes quantum information onto an antimony atom, which has eight possible states that enable data to be more safely stored than in a standard two-state qubit, or quantum bit.

Errors are a key barrier to the development of quantum computers, and the breakthrough is a vital step toward making them less likely to occur and, when they do, easier to detect and correct. The researchers published their findings Wednesday (Jan. 14) in the journal Nature Physics.

First devised by physicist Erwin Schrödinger in 1935, the thought experiment evocatively describes the weird rules of the quantum world by imagining a cat placed inside an opaque box with a poison vial whose opening mechanism is controlled by radioactive decay — a completely random quantum process.

Related: Quantum computers that are actually useful 1 step closer thanks to new silicon processor that could pack millions of qubits

Until the box is opened and the cat is observed, Schrödinger argued, the rules of quantum mechanics mean that the unfortunate feline will exist in a superposition of states, simultaneously dead and alive.

In the case of a qubit, quantum information relating to the 0 or 1 states of a classical computer can be encoded on the "spin up" and "spin down" states of an atom — spin being the intrinsic angular momentum of a fundamental particle.

But if noise within a quantum computer causes this spin to suddenly change (as often happens), the quantum state would be lost, producing an error and destroying the information within.

To get around this problem, the researchers behind the new study embedded an antimony atom, which has eight different spin directions, inside a silicon quantum chip. The antimony atom's six additional spin directions (a result of its composite nature, which combines multiple individual spins) mean that, unlike in a two-spin-state system, a single error isn't enough to destroy the encoded information.

"As the proverb goes, a cat has nine lives. One little scratch is not enough to kill it," co-author Benjamin Wilhelm, a doctoral student in electrical engineering and telecommunications at the University of New South Wales (UNSW) in Australia, said in a statement. "Our metaphorical 'cat' has seven lives: it would take seven consecutive errors to turn the '0' into a '1'!"

With this system in place, the researchers say they will now work to demonstrate a method for detecting and correcting errors in their chip, a feat that is considered a "Holy Grail" within the quantum computing field.

"If an error occurs, we detect it straight away, and we can correct it before further errors accumulate. To continue the 'Schrödinger cat' metaphor, it's as if we saw our cat coming home with a big scratch on his face," co-author Andrea Morello, a professor of electrical engineering and quantum physics at UNSW, said in the statement. "He's far from dead, but we know that he got into a fight; we can go and find who caused the fight, before it happens again and our cat gets further injuries."

]]>
https://www.livescience.com/technology/computing/schrodingers-cat-breakthrough-could-usher-in-the-holy-grail-of-quantum-computing-making-them-error-proof JtgVMBwEEoGc4VSWij2xrK Thu, 16 Jan 2025 14:59:47 +0000
<![CDATA[ Nvidia's mini 'desktop supercomputer' is 1,000 times more powerful than a laptop — and it can fit in your bag ]]> LAS VEGAS — Scientists have created a new mini PC that is almost as powerful as a supercomputer but can fit in your bag.

The new device, dubbed "Project Digits," is designed for developers, researchers, students and data scientists who work with artificial intelligence (AI). Its uses include running AI models that would have previously required tapping into massive data centers via the cloud, Nvidia's CEO Jensen Huang announced at CES 2025 in Las Vegas.

Although the product design has not yet been finalized, it will be small enough to fit on your desk or even in your bag.

Nvidia's Project DIGITS

The new device features an Nvidia Blackwell graphics card and an Nvidia Grace processor alongside 128 gigabytes of memory and 4 terabytes of SSD storage. (Image credit: Future/Keumars Afifi-Sabet)

The device is powered by an Nvidia GB10 Grace Blackwell Superchip, which houses separate, linked components on a single chip to reduce the time it takes to move data between them.

The superchip features an Nvidia Blackwell graphics card and an Nvidia Grace processor, packaged with 128 gigabytes of memory and 4 terabytes of SSD storage.

Related: Google 'Willow' quantum chip has solved a problem the best supercomputer would have taken a quadrillion times the age of the universe to crack

Altogether, the device is approximately 1,000 times more powerful than the best laptops.

A mini PC that's 1,000 times more powerful than your PC or laptop

"NVIDIA's Project DIGITS enables researchers in robotics, computer vision and autonomous systems to experiment, fine-tune and scale solutions faster than ever — all while fitting on your desk," said Raquel Urtasun, a professor of computer science at the University of Toronto and founder of Waabi, a self-driving car company, in a statement. Waabi uses Nvidia's technology in its fleet of self-driving trucks. "I'm excited to see what breakthroughs Project DIGITS will enable."

Supercomputing power is measured in floating point operations per second (FLOPS). The most powerful supercomputers in the world deliver a little over 1,000 petaFLOPS of power (1 quintillion FLOPS). This makes them 1 million times more powerful than laptops, according to IBM.

Project Digits, by contrast, can provide 1 petaFLOPS of power. It won't match the very best supercomputers, but it's way more powerful than most desktop PCs and the best laptops and can fit into a considerably smaller chassis.

Using this "desktop supercomputer," researchers can run large language models — generative AI tools like ChatGPT — that use up to 200 billion parameters, while two Project DIGITS devices can be connected to achieve 405 billion parameters. For reference, GPT-3.5, which powered the first version of ChatGPT when it launched in November 2022, was approximately 175 billion parameters in size, with each parameter being a variable that controls how a model processes and generates text.

The device's design has not been finalized yet. Nvidia representatives did not share an image of the prototype, but they said it was set to launch in May for approximately $3,000 — with people able to register their interest in buying one.

]]>
https://www.livescience.com/technology/computing/nvidias-mini-desktop-supercomputer-is-1-000-times-more-powerful-than-your-laptop-and-can-fit-in-your-pocket biwqEdYDocGs57yhK7biPW Tue, 07 Jan 2025 04:07:34 +0000
<![CDATA[ Quantum computers that are actually useful 1 step closer thanks to new silicon processor that could pack millions of qubits ]]> Scientists say they have reached "a critical inflection point" after developing a technology that makes silicon-based quantum processors more viable.

Quantum computing company Equal1 has created a quantum processing unit (QPU) that can be built using conventional semiconductor manufacturing processes. This avoids the complexity and expense typically involved in producing quantum processors from exotic materials or with complicated techniques.

The company has also developed what representatives called "the most complex quantum controller chip developed to date." This can operate at ultra-low temperatures and paves the way for millions of qubits on a single chip — meaning it can handle a huge number of quantum bits of information simultaneously while keeping them stable and accurate for calculations.

By contrast, the most powerful quantum chips today only house qubits in the thousands and are built with superconductors, all requiring cooling to near absolute zero in order to perform quantum computations.

Combined, the new technologies "pave the way for the next phase of quantum computing and demonstrate the fastest way to scaling is to leverage existing silicon infrastructure," Equal1 representatives said in a statement.

Quantum impracticalities

Building quantum chips is a notoriously difficult and expensive process. Unlike regular computer chips, which rely on binary bits to process information as 1s or 0s, quantum chips use qubits, which are grounded in the principles of quantum mechanics.

Qubits have special properties that allow them to exist in multiple states simultaneously — a phenomenon called superposition — and to work together in ways traditional bits cannot through a process called entanglement. The resultant parallel processing enables quantum computers to solve problems far beyond the capabilities of classical systems.

However, qubits are incredibly fragile. They only work when they are kept in a state of coherence, meaning they maintain their quantum state long enough to perform calculations. Coherence is easily disrupted by environmental factors like temperature changes or electromagnetic noise — hence the need for extremely low temperatures, to avoid interference.

Related: Will we ever have quantum laptops?

Typically, quantum chips are also made using exotic or custom-built materials like superconducting metals, which require expensive and complex manufacturing processes. Equal1’s innovation is its use of silicon — one of the most abundant and widely used materials in the semiconductor industry.

Silicon provides a stable environment for qubits, particularly when using a material blend called silicon germanium (SiGe). In a study published Dec. 2 to the preprint database arXiv, Equal1 scientists explained that SiGe combines silicon’s stability with germanium’s ability to enhance electronic performance, making it well-suited for quantum applications. More importantly, SiGe chips can be produced using the same processes and factories that are already used to manufacture traditional computer chips, potentially making quantum processors cheaper and easier to scale.

Equal1 representatives said its SiGe 6-qubit array — which is the part of the chip where qubits are created and controlled — had broken ground in two key areas: the precision of quantum gate operations and the speed at which those operations are performed.

Specifically, the chip demonstrated a single-qubit gate fidelity of 99.4% with an operation speed of 84 nanoseconds and a two-qubit gate fidelity of 98.4% with a speed of 72 nanoseconds. High accuracy, or fidelity, in quantum gates minimizes errors in calculations, while faster gate speeds reduce the risk of qubits losing their quantum properties during operations. These factors determine the accuracy of quantum computations and the ability of qubits to maintain their quantum states long enough to complete complex operations.
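To see why fractions of a percent matter, consider how gate errors compound over a circuit. The sketch below assumes independent gate errors, a simplification that real devices only approximate:

```python
# Probability that a circuit of n sequential gates runs with no error,
# assuming each gate fails independently (a simplifying assumption).
def circuit_success(fidelity: float, n_gates: int) -> float:
    return fidelity ** n_gates

for n in (10, 100, 1000):
    p = circuit_success(0.994, n)   # Equal1's single-qubit gate fidelity
    print(f"{n:>4} gates: {p:.1%} chance of an error-free run")
# ->   10 gates: 94.2%
# ->  100 gates: 54.8%
# -> 1000 gates:  0.2%
# Deep circuits demand error correction, not just slightly better gates.
```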

"This result demonstrates the massive benefit of silicon qubits — the ability to achieve the performance required for scaling in two key areas — fidelity and speed of quantum gates." Nodar Samkharadze, chief quantum architect at Equal1, said in the statement.

Putting a spin on it

To ensure reliable quantum operations, Equal1’s device uses "spin qubits." Spin qubits encode information in the spin state of an electron. In their study, the scientists said spin qubits are particularly well-suited for integration with silicon because silicon provides a stable environment for electron spins. This reduces the risk of qubits losing their delicate quantum properties due to interference from their surroundings.

Equal1 also developed a quantum controller chip that uses a multi-tile architecture; this design divides a chip into multiple tiles that can operate semi-independently. This architecture is key to scaling quantum systems because it allows control functions to be distributed across the chip, avoiding the bottlenecks that can occur when relying on a single processing unit.

The controller operates at 300 millikelvin — a temperature just above absolute zero — which allows it to effectively manage qubits while maintaining the conditions needed for coherence. Equal1 representatives said the controller also features artificial intelligence (AI)-driven error correction technology, enabling real-time adjustments that maintain the stability and accuracy of quantum operations.

"Today marks a critical inflection point for Equal1 and the quantum computing industry," added Elena Blokhina, the company's chief scientific officer, in the statement. "Equal1 has always believed that silicon is the vehicle to scale quantum computers and today, with these world leading qubit and control chip results, we have taken a major step towards this vision."

]]>
https://www.livescience.com/technology/computing/quantum-computers-that-are-actually-useful-1-step-closer-thanks-to-new-silicon-processor-that-could-pack-millions-of-qubits anYzjK8oAxfY7vCVTtqStT Sat, 04 Jan 2025 15:00:00 +0000
<![CDATA[ Qubits inspired by 'Schrödinger's cat' thought experiment could usher in powerful quantum computers by 2030 ]]> A fault-tolerant quantum computer could be here by 2030, thanks to an invention called the "cat qubit," named after the famous Schrödinger's cat thought experiment, in which a cat locked in a box with a radioactive pellet exists in a superposition of "dead" and "alive" states until the box is opened.

Researchers from the Paris-based quantum technology company Alice & Bob unveiled the roadmap in a white paper published earlier this month.

This new "quantum era" would be realized as soon as scientists build a quantum processing unit (QPU)capable of holding 100 logical qubits. Logical qubits are collections of physical qubits that share the same information to ensure that calculations can continue when a single qubit within the group fails. Because qubits are inherently error-prone — failing at a rate of 1 in 1,000 (versus classical bits, which fail at a rate of 1 in 1 million million) — quantum calculations are often disrupted.

The scientists have already accomplished the first step in this roadmap by developing the cat qubit. Like its doomed namesake, the cat qubit exists in a double superposition of two quantum states simultaneously. More conventional qubits exist in a single superposition, existing as both 0 and 1. One key advantage of a cat qubit is that as you scale up the number of qubits, the number of so-called "bit-flip" errors — where a 0 switches to a 1 or vice versa — decreases dramatically. Other types of errors become more common, but the tradeoff is still worth it.

Crucially, cat qubits are resistant to decoherence — interference from the external environment that causes qubits to lose their quantum properties and lose any useful information they carry.

But to achieve their goal of useful quantum computing by 2030, Alice & Bob scientists have identified four further milestones that need to be reached: build a logical qubit capable of error correction; create the first error-corrected logical gate, otherwise known as a quantum circuit; create a universal set of logical gates; and achieve real-time error correction. Once all those steps are completed, they will need to create a processor that can house 100 high-quality logical qubits.

Although each milestone builds on the one before it, there is much to accomplish within five years.

The white paper from Alice & Bob doesn't address unexpected setbacks or "unknown unknowns" (often referred to as black swans). Unlike risks, which can be anticipated and accounted for, unknown unknowns are completely unexpected.

And even if a chip capable of holding 100 logical qubits is developed, that does not necessarily mean that the technology would be commercially viable and deployable at scale.

]]>
https://www.livescience.com/technology/computing/qubits-inspired-by-schrodingers-cat-thought-experiment-could-usher-in-powerful-quantum-computers-by-2030 y4tojLWxWwDAw97gV6B2h3 Fri, 03 Jan 2025 12:00:00 +0000
<![CDATA[ What is quantum supremacy? ]]> Quantum computers are expected to solve some problems beyond the reach of the most powerful supercomputers imaginable. Reaching this milestone has been dubbed "quantum supremacy."

But whether quantum supremacy has been achieved yet and what it would mean for the field remain unsettled.

The term "quantum supremacy" was coined in 2012 by John Preskill, a professor of theoretical physics at Caltech, to describe the point at which a quantum computer can do something that a classical one cannot.

Crossing this threshold has become a guiding star for the tech companies that are building large-scale quantum computers. In 2019, in a paper published in the journal Nature, Google became the first to declare it had achieved quantum supremacy. Other groups have made similar claims in recent years.

However, several of these assertions, including Google's, have since been rejected, after researchers developed novel classical algorithms that go toe-to-toe with quantum computers.

In addition, quantum supremacy experiments have focused on problems with no obvious practical applications, suggesting that useful quantum computers could still be some way off, William Fefferman, an assistant professor of computer science at the University of Chicago, told Live Science. Nonetheless, the idea has helped drive progress in the field and will be a crucial springboard toward more powerful machines, he added.

"You need to walk before you can run," Fefferman said. "I don't think anyone has a perfect road map for how to go from achieving quantum advantage in a really decisive way to this next step of solving a useful problem on a near-term quantum computer. But I'm convinced it's the first step in the process."

How quantum supremacy demonstrations have manifested so far

Theoretical computer scientists have discovered several quantum algorithms that can, in principle, solve problems much faster than classical ones. That’s because they can exploit quantum effects like entanglement and superposition to encode data very efficiently and process many more calculations in parallel than a classical computer can. But the number of qubits — the quantum equivalent of bits — required to implement them at sufficient scale to show an advantage is far beyond what's available with today's quantum processors.

As a result, efforts to demonstrate quantum supremacy have focused on highly contrived problems designed to favor the quantum computer. Google's 2019 experiment involved a 54-qubit processor carrying out a series of random operations. Although the output would be fundamentally useless, the researchers estimated that it would take roughly 10,000 years to simulate the process on Oak Ridge National Laboratory's Summit supercomputer, the most powerful classical machine in the world at the time.

That's because the unusual properties of quantum mechanics mean that simulating these systems on a classical computer quickly becomes intractable as they get larger, said Simon Benjamin, a professor of quantum technologies at the University of Oxford. "It's not that quantum computers are mysterious, magical things," he said. "We know the equations that they obey. But as you consider larger ones, it gets tougher and tougher for the classical computer to keep track of these equations."

This is due to the quantum phenomenon of superposition. Whereas a bit in a classical computer can represent only 1 or 0, a qubit can encode a complex mixture of both states at the same time. Crucially, multiple qubits can be in a shared superposition, meaning that a quantum system can represent all possible combinations of qubit values simultaneously.

That means that describing two qubits requires four numbers to cover all possible states of the system, Benjamin explained. And for each additional qubit, the number of classical bits required to represent the quantum computer's state doubles. "Pretty fast we find ourselves getting to big numbers," he said.

To provide an idea of how quickly the problem scales, Benjamin said, a 30-qubit system can be comfortably simulated on a good laptop. By 40 qubits, you would need a university-scale supercomputer, and by around 46 qubits, you'd reach the limits of the world's most powerful classical machines.
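The doubling is easy to make concrete. The standard estimate stores 2^n complex amplitudes at 16 bytes each:

```python
# Memory needed to store an n-qubit state exactly on a classical machine:
# 2^n complex amplitudes at 16 bytes each (two 64-bit floats).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 46):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB")
# -> 30 qubits: 16 GiB           (a good laptop)
# -> 40 qubits: 16,384 GiB       (~16 TiB, a university supercomputer)
# -> 46 qubits: 1,048,576 GiB    (~1 PiB, the largest classical machines)
```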

However, these estimates refer to the challenge of exactly simulating a perfect quantum system. In reality, today's quantum computers are highly error-prone, which provides shortcuts for classical algorithms. In 2022, a group from the Chinese Academy of Sciences showed that a university-scale supercomputer could simulate Google's 2019 quantum experiment in just hours, in part by sacrificing accuracy for speed.

Why quantum utility is preferable to quantum supremacy

Other quantum supremacy claims have met similar challenges. A group at the University of Science and Technology of China claimed in a 2021 paper that a random sampling operation they carried out on a 144-qubit light-based quantum computer would be beyond any classical machine. But Fefferman said his group has since shown that they can exploit the noise in the system to simulate the experiment in less than an hour. The same approach should be able to simulate a similar quantum supremacy experiment announced by startup Xanadu in 2022, he added.

As far as Fefferman knows, there are two quantum supremacy experiments still standing. In 2023, Google used a 70-qubit processor to extend the company's previous result, and in 2024, Quantinuum claimed to have crossed the milestone with its 56-qubit H2-1 quantum computer. But Fefferman wouldn't be surprised if classical approaches are developed that can quickly simulate these experiments in the future. "I'm not holding my breath," he said.

A definitive achievement of quantum supremacy will require either a significant reduction in quantum hardware's error rates or a better theoretical understanding of what kind of noise classical approaches can exploit to help simulate the behavior of error-prone quantum computers, Fefferman said.

But this back-and-forth between quantum and classical approaches is helping push the field forwards, he added, creating a virtuous cycle that is helping quantum hardware developers understand where they need to improve.

"Because of this cycle, the experiments have improved dramatically," Fefferman said. "And as a theorist coming up with these classical algorithms, I hope that eventually, I'm not able to do it anymore."

While it's uncertain whether quantum supremacy has already been reached, it's clear that we are on the cusp of it, Benjamin said. But it's important to remember that reaching this milestone would be a largely academic and symbolic achievement, as the problems being tackled are of no practical use.

"We're at that threshold, roughly speaking, but it isn't an interesting threshold, because on the other side of it, nothing magic happens," Benjamin said. "Quantum computers don't suddenly become useful."

That's why many in the field are refocusing their efforts on a new goal: demonstrating "quantum utility," or the ability to show a significant speedup over classical computers on a practically useful problem. Some groups, including researchers at IBM, are hopeful that even today's error-prone quantum computers could achieve this in the near term on some specific problems.

Google also recently demonstrated a key milestone in the race to achieve fault-tolerant quantum computing. Its "Willow" quantum processor was the first to remove more errors than are introduced as the number of physical qubits in a logical qubit is scaled up. This means exponential error reduction and a possible pathway to error-free quantum computing.

But Benjamin said there is growing consensus in the field that quantum utility won't be reached until we have fault-tolerant quantum computers. This will require quantum processors with many more qubits than we have today, he said, as the most well-studied quantum error-correction codes require on the order of 1,000 physical qubits to produce a single fault-tolerant, or logical, qubit.

With today's largest quantum computers having just crossed the 1,000-qubit mark, this is likely still some way off. "I'm optimistic that eventually such a quantum computer will exist, but I'm pessimistic that it will exist in the next five or 10 years," Fefferman said.

]]>
https://www.livescience.com/technology/computing/what-is-quantum-supremacy Nbm5oh5DiXgHmwSViB42cG Fri, 27 Dec 2024 10:00:00 +0000
<![CDATA[ New quantum computing milestone smashes entanglement world record ]]> Researchers have set a new record for quantum entanglement — bringing reliable quantum computers a step closer to reality. The scientists successfully entangled 24 "logical qubits" — low-error quantum bits of information created by combining multiple physical qubits. This is the highest number ever achieved to date.

They also demonstrated that logical qubits can maintain error correction as the number of qubits increases, a crucial step toward larger, more fault-tolerant quantum systems. The researchers detailed their work in a study published Nov. 18 on the preprint database arXiv.

Despite the incredible promise of quantum computing, several key barriers stand in the way of replacing classical computing. One of these obstacles is controlling qubits — the basic units of quantum information — which is extremely difficult.

Unlike the binary 1s and 0s of traditional computer bits, qubits operate on an entirely different set of mechanics — quantum mechanics, to be precise. While qubits can exist as 1s and 0s, they can also exist as both at the same time, a phenomenon known as superposition. This makes measuring qubits a major challenge.

Related: Quantum computers are here — but why do we need them and what will they be used for?

Two other quantum phenomena, coherence and entanglement, throw additional spanners into the works. Coherence is a measure of how long qubits retain the desired state needed to process quantum calculations. Coherence times are usually measured in fractions of a second and can be disrupted by the tiniest of environmental factors.

When qubits lose coherence, they often also lose entanglement — a mechanism whereby the state of one qubit is tied directly to that of another. This loss of coherence and entanglement adversely affects the ability of quantum computers to perform calculations accurately and reliably.

Enter the logical qubit

In recent years, researchers have increasingly focused on logical qubits as a means of overcoming the fragility of physical qubits.

While physical qubits are typically made of charged particles like ions or superconducting circuits, logical qubits are created by encoding quantum information across multiple physical qubits. This architecture provides an error-correction system, so that if one qubit becomes unstable or loses information, the other qubits can detect and correct it.

The scientists successfully entangled their record-breaking 24 logical qubits using Atom Computing's "neutral-atom quantum processor," which processes and stores quantum information by manipulating individual atoms with lasers, and Microsoft's "qubit-virtualization system," a software platform that helps manage and stabilize qubits by detecting and correcting errors in real time.

While 24 may not seem like a huge number, the ability to entangle this many logical qubits represents a key milestone toward creating scalable, fault-tolerant quantum systems, the researchers said.

"Fault tolerant quantum computing is essential for being able to solve large computational problems that enable scientific and economic value beyond classical computing, and it requires the integration of multiple advanced technologies and quantum error correction algorithms to provide sufficient reliable computing resources in a sustainable way," Atom representatives said in a statement. "With these results [we] have now demonstrated all of the key ingredients necessary for supporting quantum error correction."

The researchers also showed how logical qubits can perform complex tasks and keep errors in check as quantum computers scale. Using the same Atom system, they created and ran computations on 28 logical qubits, proving that it's possible to maintain error correction as quantum systems get more powerful and complex.

"By coupling our state-of-the-art neutral-atom qubits with Microsoft's qubit-virtualization system, we are now able to offer reliable logical qubits on a commercial quantum machine," Ben Bloom, founder and CEO of Atom Computing, said in a statement. "This system will enable rapid progress in multiple fields including chemistry and materials science."

]]>
https://www.livescience.com/technology/computing/new-quantum-computing-milestone-smashes-entanglement-world-record oVTPnzDu9WEd2Utokum4Ba Tue, 10 Dec 2024 12:30:00 +0000
<![CDATA[ Google 'Willow' quantum chip has solved a problem the best supercomputer would have taken a quadrillion times the age of the universe to crack ]]> Google scientists have created a new quantum processor that, in five minutes, cracked a problem that would have taken the world's best supercomputer 10 septillion years to solve. The breakthrough will allow quantum computers to become less error-prone the bigger they get, achieving a milestone that overcomes a decades-long obstacle.

Quantum computers are inherently "noisy," meaning that, without error-correction technologies, every one in 1,000 qubits — the fundamental building blocks of a quantum computer — fails.

It also means coherence times (how long the qubits can remain in a superposition so they can process calculations in parallel) remain short. By contrast, every one in 1 billion billion bits fails in conventional computers.

This high error rate is one of the key barriers to scaling up these machines so they are good enough to perform far better than the fastest supercomputers. This is why research has centered on building quantum computers with better and less error-prone — not simply more — qubits.

Google says its new quantum processing unit (QPU), dubbed "Willow," is the first in the world to achieve results that are "below threshold" — a milestone outlined by computer scientist Peter Shor in a 1995 paper. The team outlined the technology in a study published Dec. 9 in the journal Nature.

Cracking a problem set decades ago

Close up of the Willow chip

The new "Willow" quantum processor is fitted with 105 physical qubits combined with error-correcting technologies that mean the more qubits you add, the more reliable quantum computers can be. (Image credit: Google Quantum AI)

The breakthrough — achieving this "below threshold" milestone — means that errors in a quantum computer will reduce exponentially as you add more physical qubits. It charts a path for scaling up quantum machines in the future.

The technology relies on logical qubits. This is a qubit encoded using a collection of physical qubits in a lattice formation. All the physical qubits in a single logical qubit share the same data, meaning if any qubits fail, calculations continue because the information can still be found within the logical qubit.
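The simplest way to see the idea is a classical repetition code, sketched below. It is only an analogy: Willow uses a quantum surface code, which detects errors without directly reading out the data, but the majority-vote intuition carries over:

```python
import random
from collections import Counter

# Toy classical repetition code: one logical bit copied across several
# physical bits, decoded by majority vote. Quantum codes are subtler
# (measuring a qubit destroys its state), but the scaling is analogous.
def add_noise(bits: list[int], error_rate: float) -> list[int]:
    return [b ^ 1 if random.random() < error_rate else b for b in bits]

def decode(bits: list[int]) -> int:
    return Counter(bits).most_common(1)[0][0]   # majority vote

random.seed(0)
trials = 100_000
failures = sum(decode(add_noise([0] * 9, 0.10)) != 0 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f} vs physical rate 0.1000")
# With 9 copies at a 10% physical error rate, the logical rate falls below
# 0.1% -- adding physical bits suppresses errors, the essence of running
# "below threshold".
```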

The Google scientists built sufficiently reliable qubits for exponential error reduction by making several shifts. They improved calibration protocols, improved machine learning techniques to identify errors, and improved device fabrication methods. Most importantly, they improved coherence times while retaining the ability to tune physical qubits to get the best performance.

Related: Monster 4,400-qubit quantum processor is '25,000 times faster' than its predecessor

"What we've been able to do in quantum error correction is a really important milestone — for the scientific community and for the future of quantum computing — which is [to] show that we can make a system that operates below the quantum error correction threshold," Julian Kelly, Google Quantum AI's director of quantum hardware, told Live Science.

This challenging task requires removing more errors from a system than are introduced. Below this threshold, scientists can scale up a quantum computer to be larger and larger, and errors will continue to decline, Kelly explained.

"This has been an outstanding challenge for 30 years — since the idea of quantum error correction was conceived of in the mid-90s," Kelly said.

Mind-boggling results for quantum computing

The Google researchers tested Willow against the random circuit sampling (RCS) benchmark, which is now a standard metric for assessing quantum computing chips. In these tests, Willow performed a computation in under five minutes that would have taken today's fastest supercomputers 10 septillion years. This is close to a quadrillion times longer than the age of the universe.

The first edition of the Willow QPU can also achieve a coherence time of nearly 100 microseconds — five times better than that of Google's previous Sycamore chip.

Google first announced that Sycamore had passed the RCS benchmark in 2019, when scientists used the chip to resolve a problem that would have taken a classical supercomputer 10,000 years to calculate. In July, a new quantum computer built by Quantinuum broke that record by 100 times.

Image of Google engineer installing a quantum processor

Google first tested their Sycamore quantum computer against the RCS benchmark in 2019, and processed a calculation that would have taken a supercomputer tens of thousands of years to do. (Image credit: Google Quantum AI)

Then, in October, Google again announced that it had discovered a new "quantum phase" when using Sycamore to process calculations, meaning the best QPUs today can outperform the fastest supercomputers in practical applications for the first time.

"Coherence times are now much higher than they used to be, and we immediately translate into basically lowering all the physical operation error rates by about a factor of two," Kelly said.

"So all the underlying qubits just got better at everything they do by about a factor of two. If you look at the logical error rate between this new processor and Sycamore, there's about a factor of 20 difference — and that comes from scaling up but also pushing below threshold."

Looking beyond "below threshold"

Google scientists are now aiming to demonstrate useful and practical computations for today's quantum chips, rather than relying on benchmarking.

In the past, the team has performed simulations of quantum systems that have led to scientific discoveries and breakthroughs, Kelly told Live Science.

Image of two Google Quantum AI engineers wiring up a quantum processor

To build a "really good" logical qubit, the team needs to stitch together 1,457 physical qubits. (Image credit: Google Quantum AI)

One example involved discovering deviations from the assumed laws of physics. But these results were still within reach of the most powerful classical computers.

Next, the team wants to create a "very, very good logical qubit" with an error rate of one in 1 million. To build this, they would need to stitch together 1,457 physical qubits, they said.

This realm is challenging because it's impossible to get there using just physical hardware — you would need error-correction technology layered on top. The scientists then want to connect logical qubits together to perform better than supercomputers in benchmarking as well as real-world scenarios.

]]>
https://www.livescience.com/technology/computing/google-willow-quantum-computing-chip-solved-a-problem-the-best-supercomputer-taken-a-quadrillion-times-age-of-the-universe-to-crack KSS8xmAUQt8F5khRXgToYX Mon, 09 Dec 2024 16:00:50 +0000
<![CDATA[ World's 1st mechanical qubit uses no light or electronics. It could lead to ultra-precise gravity-sensing tech. ]]> Scientists have created the world's first mechanical qubit: a tiny, moving system that stores quantum information using vibrations instead of electric currents or light.

Qubits are the fundamental units of quantum information. Unlike the bits you'd find in a classical computer, qubits can exist as 0, 1, or a superposition of both, thanks to the weird inner workings of quantum mechanics and entanglement.

Traditionally, these are made from superconducting circuits, charged atoms (ions), or light particles (photons). The new mechanical qubit, however, uses phonons — a type of "quasiparticle" — generated by vibrations within a precisely engineered sapphire crystal.

A quasiparticle is a concept used to describe the behavior and interactions of a group of particles as if they were acting as a single particle. In this case, phonons represent quasiparticles that essentially serve as carriers of vibrational energy.

The breakthrough could pave the way for ultra-sensitive sensor technologies capable of detecting forces like gravity, as well as new methods for maintaining stability in quantum computers for longer periods, the scientists said. They published their study Nov. 14 in the journal Science.

Related: Will we ever have quantum laptops?

Mechanical systems have historically been considered too challenging to be used as qubits because, thanks to the principles of quantum mechanics, they are never completely still. This means there is always residual motion that needs to be accounted for and controlled in order for them to work at the quantum level.

Likewise, mechanical oscillators — devices that store and transfer energy in the form of phonons — are typically subject to harmonic vibrations at evenly spaced energy levels. This is an issue, the scientists explained, because uniform spacing makes it difficult to isolate the two energy states needed to represent the 0 and 1 of a qubit.

"[The challenge] is whether you can make the energy levels unequally spaced enough that you can address two of them without touching the others," study co-author Yiwen Chu, a physicist at ETH Zürich, told Science.

The researchers tackled this problem by creating a "hybrid" system, coupling a sapphire crystal resonator measuring 400 micrometers (0.4 mm) with a superconducting qubit and tuning the two to interact at slightly offset frequencies. This interaction blended their quantum states, resulting in unevenly spaced energy levels in the resonator — a phenomenon known as "anharmonicity."

This enabled the researchers to isolate two distinct energy states, effectively turning the resonator into a mechanical qubit.
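A quick numerical illustration of why uneven spacing matters; the frequency and anharmonicity values below are invented for illustration, not taken from the ETH device:

```python
# Energy levels of a harmonic vs. anharmonic oscillator (invented units).
w, a = 5.0, 0.3   # hypothetical frequency and anharmonicity

def harmonic(n: int) -> float:
    return n * w                         # evenly spaced levels

def anharmonic(n: int) -> float:
    return n * w - a * n * (n - 1) / 2   # spacing shrinks as n grows

for n in range(3):
    print(f"level {n}->{n + 1}: harmonic gap {harmonic(n + 1) - harmonic(n):.1f}, "
          f"anharmonic gap {anharmonic(n + 1) - anharmonic(n):.1f}")
# -> 0->1: 5.0 vs 5.0 | 1->2: 5.0 vs 4.7 | 2->3: 5.0 vs 4.4
# Only in the anharmonic case is the 0<->1 gap unique, so a drive tone at
# that frequency addresses just those two levels -- a usable qubit.
```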

While the mechanical qubit could hold and manipulate quantum information, the system’s fidelity — a measure of how accurately it performs quantum operations — was recorded as just 60%. By comparison, state-of-the-art superconducting qubits often achieve fidelities above 99%.

Even so, mechanical qubits may offer unique advantages, the scientists said. For instance, they can interact with forces like gravity in ways that other quantum systems cannot, making them promising candidates for the development of highly sensitive quantum sensors.

Mechanical qubits may also be able to store quantum information for longer periods of time, they said. This is critical for maintaining coherence — a measure of how long a system can stay stable and perform calculations using quantum data without interference.

The researchers are now working to link multiple mechanical qubits together to perform basic calculations, which they said would mark a key step toward practical applications for the technology.

]]>
https://www.livescience.com/technology/computing/worlds-1st-mechanical-qubit-uses-no-light-or-electronics-it-could-lead-to-ultra-precise-gravity-sensing-tech kjZiKTepkgQQ6yswmdTVwD Sat, 07 Dec 2024 12:00:00 +0000
<![CDATA[ 'Accidental discovery' creates candidate for universal memory — a weird semiconductor that consumes a billion times less power ]]> Scientists may have accidentally overcome a major barrier to smoothening the adoption of next-generation data-storage technologies.

Using a unique material called indium selenide (In2Se3), researchers say they discovered a technique for lowering the energy requirements of phase-change memory (PCM) — a technology capable of storing data without a constant power supply — by up to 1 billion times.

The breakthrough is a step toward overcoming one of the biggest challenges in PCM data storage, potentially paving the way for low-power memory devices and electronics, the researchers said in a study published Nov. 6 in the journal Nature.

PCM is a leading candidate for universal memory — computing memory that can replace both short-term memory like random access memory (RAM) and storage devices like solid-state drives (SSDs) or hard drives. RAM is fast but needs significant physical space and a constant power supply to run, while SSDs or hard drives are much denser and can store data while computers are turned off. Universal memory combines the best of both.

It works by toggling materials between two states: crystalline, where atoms are neatly ordered, and amorphous, where atoms are randomly arranged. These states correspond to binary 1s and 0s, encoding data via switches between states.

However, the "melt-quench technique" used to toggle these states — which involves heating and rapidly cooling PCM materials — requires significant energy, making the technology expensive and difficult to scale. In their study, the researchers found a way to bypass the melt-quench process entirely by instead inducing amorphization through an electrical charge. This slashes PCM's energy requirements and potentially opens the door to broader commercial applications.

"One of the reasons why phase-change memory devices haven't reached widespread use is due to the energy required," study author Ritesh Agarwal, a professor of materials science and engineering at Penn Engineering, said in a statement. The potential of these findings for designing low-power memory devices is "tremendous," he said.

Related: Unique transistor 'could change the world of electronics' thanks to nanosecond-scale switching speeds and refusal to wear out

The researchers' discovery hinges on the unique properties of indium selenide, a semiconductor material with both "ferroelectric" and "piezoelectric" characteristics. Ferroelectric materials can spontaneously polarize, meaning they can generate an internal electric field without needing an external charge. Piezoelectric materials, by contrast, physically deform when they are exposed to an electric charge.

While testing the material, the researchers observed that sections of it amorphized when they were exposed to a continuous current. What's more, this happened entirely by chance.

"I actually thought I might have damaged the wires, study co-author Gaurav Modi, a former doctoral student in materials science and engineering at Penn Engineering, said in the statement. "Normally, you would need electrical pulses to induce any kind of amorphization, and here a continuous current had disrupted the crystalline structure, which shouldn't have happened."

Further analysis revealed a chain reaction triggered by the semiconductor's properties. It begins with tiny deformations in the material, caused by the current, that trigger an "acoustic jerk" — a sound wave similar to seismic activity during an earthquake. This then travels through the material, spreading amorphization across micrometer-scale regions in a mechanism the researchers likened to an avalanche gathering momentum.

The researchers explained that various properties of indium selenide — including its two-dimensional structure, ferroelectricity and piezoelectricity — work together to enable an ultra-low-energy pathway for amorphization triggered by shocks. This could lay the groundwork for future research around "new materials and devices for low-power electronic and photonic applications," they wrote in the study.

"This opens up a new field on the structural transformations that can happen in a material when all these properties come together," Agarwal said in the statement.

]]>
https://www.livescience.com/technology/computing/accidental-discovery-creates-candidate-for-universal-memory-a-weird-semiconductor-that-consumes-a-billion-times-less-power gcDW5jrrvKHcpJVLXJFCdV Wed, 04 Dec 2024 12:00:00 +0000
<![CDATA[ 1st-of-its-kind cryogenic transistor is 1,000 times more efficient and could lead to much more powerful quantum computers ]]> A new type of transistor can dissipate almost zero heat — slashing energy usage in future quantum computers by up to 1,000 times and paving the way for massively scaled-up machines.

The engineers who created the device say it's the world's first transistor capable of functioning efficiently in cryogenic conditions — extremely low temperatures below -238 degrees Fahrenheit (-150 degrees Celsius).

It performs optimally at temperatures of 1 kelvin and lower — close to absolute zero, they explained in a study uploaded to the preprint database arXiv Oct. 1. (The study has not been peer-reviewed.)

Quantum computers need to be cooled to near-absolute zero for the qubits that power them to reach a state of "coherence," where they occupy a superposition of 1 and 0, the conventional bits of binary data. When you entangle qubits — link them over time and space so they share information — quantum computers can process calculations in parallel, whereas classical computers must process them in sequence one by one.

Related: Unique transistor 'could change the world of electronics' thanks to nanosecond-scale switching speeds and refusal to wear out

Conventional components perform incredibly inefficiently at these sub-freezing temperatures, the scientists said. They're also very hard to maintain — as more and more qubits are added to a system, the more heat is emitted, which makes it more difficult and expensive to sustain these ultralow temperatures.

Because the new transistor — dubbed the "cryo-CMOS transistor" — is optimized to operate at temperatures under 1 K and emit near-zero heat, it offers plenty of advantages over traditional electronics, representatives of the Finnish company SemiQon, which developed the transistor, said in a statement.

It cuts heat dissipation by 1,000 times and consumes 0.1% of the power of traditional transistors. This allows control and readout electronics to be placed directly into the "cryostat" — a gigantic barrel responsible for the cooling — for the first time. It means that future machines can be scaled up far more cost-effectively and with fewer errors that disrupt calculations.

“It was clear to us and others in the scientific community that a transistor which can operate efficiently at ultra-low temperatures would offer substantial value to users in the advanced computing sector and wherever these devices are required to function in cryogenic conditions," Himadri Majumdar, SemiQon's CEO and co-founder, said in the statement.

Beyond quantum applications, the transistors could be used in high-performance computing, like in the world's fastest supercomputers and in space, company representatives said.

]]>
https://www.livescience.com/technology/computing/1st-of-its-kind-cryogenic-transistor-is-1-000-times-more-efficient-and-could-lead-to-much-more-powerful-quantum-computers 7Fa6GWrxZbVzZxzDksFdQm Tue, 26 Nov 2024 10:11:14 +0000
<![CDATA[ 'Quantum hard drives' closer to reality after scientists resolve 10-year-old problem ]]> Scientists say they have cracked a decade-old problem that could bring the concept of a "quantum hard drive" closer to reality.

The solution involved developing a new type of error-correction system for stabilizing qubits — the building blocks of quantum information — against interference, overcoming a major hurdle facing the development of practical quantum computers.

If successfully scaled, the technique could pave the way for highly efficient quantum memory systems capable of storing huge volumes of quantum data, researchers claimed in a new study published Nov. 4 in the journal Nature Communications.

"This advance is crucial for the development of scalable quantum computers, as it allows for a more compact construction of quantum memory systems," the researchers said in a statement. "By reducing the physical qubit overhead, the findings pave the way for the creation of a more compact 'quantum hard drive' — an efficient quantum memory system capable of storing vast amounts of quantum information reliably."

Related: Will we ever have quantum laptops?

One of the biggest challenges in quantum computing lies in managing errors that disrupt calculations.

Quantum computers rely on qubits, tiny units of quantum information akin to bits in classical computers, that are incredibly sensitive to environmental disturbances like temperature changes and electromagnetic interference. Even minuscule disruptions to a qubit’s delicate quantum state can result in lost data and errors in quantum systems.

For years, researchers have worked on ways to keep these qubits, and the quantum data they hold, stable. Error correction in quantum systems is typically achieved by organizing qubits in a lattice structure that follows a topological "code." The aim is to win an "arms race" by using as few physical qubits as possible to manage errors as they arise, the researchers explained.

However, current 3D error-correction methods can only handle errors along a single line of qubits, limiting how much error they can manage as the system grows. The researchers overcame this problem with an error-correction architecture that uses a 3D lattice of qubits organized by a topological code, enabling errors to be corrected across two-dimensional surfaces within the 3D structure rather than just along a single line.

This structure can handle more errors as the system grows by correcting them over broader, two-dimensional surfaces within the 3D lattice, allowing it to scale more efficiently, the researchers said.

"There remain significant barriers to overcome in the development of a universal quantum computer. One of the biggest is that we need to use most of the qubits — quantum switches at the heart of the machines — to suppress the errors that emerge as a matter of course within the technology," lead author Dominic Williamson, researcher at the University of Sydney Nano Institute and School of Physics, said in the statement.

"Our proposed quantum architecture will require fewer qubits to suppress more errors, liberating more for useful quantum processing."

Prof. Stephen Bartlett, quantum theorist and director of the University of Sydney Nano Institute, added in the statement: "This advancement could help transform the way quantum computers are built and operated, making them more accessible and practical for a wide range of applications, from cryptography to complex simulations of quantum many-body systems."

]]>
https://www.livescience.com/technology/computing/quantum-hard-drives-closer-to-reality-after-scientists-resolve-10-year-old-problem 8T4u2YkNzyWSzC7P6hhASd Wed, 20 Nov 2024 12:00:00 +0000
<![CDATA[ Monster 4,400-qubit quantum processor is '25,000 times faster' than its predecessor ]]> D-Wave has completed calibrating and benchmarking its latest processor — a 4,400-plus-qubit behemoth that it claims is 25,000 times faster than its predecessor.

The Advantage2 quantum processing unit (QPU) is designed for complex applications including artificial intelligence (AI), materials science and optimization tasks. In a statement issued Nov. 6, D-Wave representatives said the new chip demonstrated "substantial performance gains" over its existing 5,000-qubit Advantage device, including improved speed and accuracy.

"Recent performance benchmarks demonstrate that the 4,400+ qubit Advantage2 processor is computationally more powerful than the current Advantage system, solving a range of problems — including 3D lattice problems common in materials science — 25,000 times faster," the company said in the statement. "The processor also delivers five times better solutions on problems requiring a high degree of precision. Furthermore, it surpasses the current Advantage system in 99% of tests on satisfiability problems, highlighting its capabilities across a wide range of quantum applications."

3D lattice problems are often used in materials science to model atomic interactions. Faster solutions mean researchers can conduct these simulations more quickly, which supports faster development and testing of new materials.

Related: Radical quantum computing theory could lead to more powerful machines than previously imagined

Boolean satisfiability (SAT) problems, meanwhile, are benchmarks that assess a system’s ability to handle complex decision-making tasks with multiple possible solutions. These tests help gauge the processor’s efficiency in applications like cryptography and logistics, where quickly finding solutions that satisfy multiple rules or conditions is essential.
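
To make the benchmark concrete, here is a minimal brute-force SAT checker in Python. The three-variable formula is invented for illustration; the point is that exhaustive checking costs 2^n trials, which is why large SAT instances stress-test decision-making hardware.

```python
from itertools import product

# A toy SAT instance in conjunctive normal form: each clause is a list of
# (variable_index, negated) pairs; the formula is satisfied when every
# clause contains at least one literal that evaluates to True.
clauses = [[(0, False), (1, True)],   # x0 OR NOT x1
           [(1, False), (2, False)],  # x1 OR x2
           [(0, True), (2, True)]]    # NOT x0 OR NOT x2

def satisfies(assignment):
    return all(any(assignment[v] != neg for v, neg in clause)
               for clause in clauses)

# Brute force tries all 2^n assignments -- the exponential cost that makes
# large SAT instances a meaningful benchmark.
n = 3
solutions = [a for a in product([False, True], repeat=n) if satisfies(a)]
print(solutions)  # [(False, False, True), (True, True, False)]
```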

As well as performance upgrades, D-Wave said its new processor delivers improvements in three key areas: coherence time, energy scale and qubit connectivity.

Coherence time refers to how long qubits — the building blocks of quantum information — can maintain their quantum state without interference. A longer coherence time allows for more stable and accurate calculations, improving the reliability of quantum computations. D-Wave reported that its new chip offers double the coherence time of its previous system.
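
As a rough back-of-envelope illustration (the numbers below are arbitrary, not D-Wave's specifications), if coherence decays roughly exponentially, doubling the coherence time meaningfully raises the chance that a computation finishes while the quantum state is still intact:

```python
import math

# Illustrative numbers only: a computation of duration t succeeds only if
# the qubits stay coherent, modeled here as exponential decay with
# characteristic time T.
t = 1.0                      # computation time, arbitrary units
for T in (1.0, 2.0):         # coherence time before and after doubling
    print(f"T = {T}: survival probability ~ {math.exp(-t / T):.2f}")
# T = 1.0 -> ~0.37, T = 2.0 -> ~0.61
```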

The Advantage2 also provides a 40% increase in energy scale, company representatives said in the statement, enabling the chip to handle more complex calculations with improved stability. Finally, qubit connectivity — the number of connections each qubit can make with other qubits — has been boosted from 15-way to 20-way, enabling the Advantage2 to tackle larger and more intricate problems than its predecessor.

"Our strategic decision to focus development efforts on enhancing the connectivity and coherence of our next annealing quantum computing system has proven successful," Trevor Lanting, chief development officer at D-Wave, said in the statement. "We’re thrilled with the performance of our recently calibrated processor, and we believe this technology will deliver amazing results for our customers, solving bigger and more complex problems."

]]>
https://www.livescience.com/technology/computing/monster-4-400-qubit-quantum-processor-is-25-000-times-faster-than-its-predecessor Khay22BPdR8m5sCLNQkRaX Mon, 18 Nov 2024 11:00:00 +0000
<![CDATA[ IBM's newest 156-qubit quantum chip can run 50 times faster than its predecessor — equipping it for scientific research ]]> IBM's latest quantum computer is now powerful enough for useful scientific research, scientists say, after the company made significant hardware and software improvements to its quantum system.

The new system is made of two parts: a new 156-qubit quantum processing unit (QPU) called IBM Heron R2 (the second generation of a chip launched last year); and Qiskit — a collection of software tools and algorithms designed to optimize quantum computing performance.

The result is a new system that can perform tasks up to 50 times faster than previous efforts, according to benchmarking data. For reference, in IBM's 2023 quantum utility experiment, published in the journal Nature, its most powerful quantum computer at the time took 122 hours to run workloads in the benchmark. The new system, fitted with the R2 Heron QPU, took just 2.4 hours.

The new quantum computers, based in IBM's data centers around the world, can tackle scientific problems across materials science, chemistry, life sciences, high-energy physics and other domains, IBM representatives said in a statement.

Related: New 'gold-plated' superconductor could be the foundation for massively scaled-up quantum computers in the future

"Advances across IBM Quantum hardware and Qiskit are enabling our users to build new algorithms in which advanced quantum and classical supercomputing resources can be knit together to combine their respective strengths," Jay Gambetta, vice president for IBM Quantum, said in the statement.

Next-generation quantum processing

The R2 Heron QPU is fitted with 156 qubits arranged in a heavy-hexagonal lattice — a topological structure that IBM uses for all its quantum processors. This enables the system to reliably execute quantum circuits of up to 5,000 two-qubit gates — which is nearly double the 2,880 two-qubit gates in the 2023 utility experiment, powered by the 127-qubit Eagle QPU.

Two-qubit gates are essential to unlocking the exponential power of a quantum computer — in which the more qubits are fitted into a system, the more calculations can run in parallel. Single-qubit gates rotate or flip the state of an individual qubit, while two-qubit gates operate on pairs of qubits, tapping into the laws of quantum mechanics to entangle them. While single-qubit gates can function on a basic level, two-qubit gates enable a quantum computer to perform far more complex calculations.
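
A minimal sketch of this idea, written with the open-source Qiskit library mentioned above and assuming the qiskit and qiskit-aer packages are installed, uses one single-qubit gate and one two-qubit gate to entangle a pair of qubits:

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# One single-qubit gate plus one two-qubit gate is enough to entangle:
# H puts qubit 0 into superposition, CX (CNOT) ties qubit 1 to it.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()

counts = AerSimulator().run(qc, shots=1_000).result().get_counts()
print(counts)  # roughly half '00' and half '11', never '01' or '10'
```

The counts come back almost entirely as '00' or '11': the two qubits are correlated in a way that no independently prepared pair of classical bits could be.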

The new R2 Heron chip also features "two-level system mitigation," which helps to reduce the impact of disturbances caused by the qubits interacting with the materials surrounding them. The system also benefits from software improvements to error handling — namely, Qiskit's tensor-network error mitigation (TEM) algorithm.

Further software improvements, including the launch of the latest generation of the runtime engine, optimizing data movement and the introduction of parametric compiling, mean the new system can run at 150,000 circuit layer operations per second (CLOPS). In comparison, base performance was just 950 CLOPS in 2022 and 37,000 CLOPS earlier this year when optimizing data movement was first introduced.

Quantum-centric supercomputing

IBM representatives claim that the latest developments feed into their vision of developing "quantum-centric" supercomputers — which combine quantum and classical computers to achieve viable results sooner than they would by using quantum computers alone.

This is because hybrid systems can address workloads in parallel, breaking down complex algorithms and assigning each part of the task to whichever half of the system is best suited to it. Once these chunks are solved, the software layer seamlessly stitches the results back together.

An example of quantum-centric supercomputing in action is at RIKEN, a scientific research center in Japan. Using a method known as "Quantum-Selected Configuration Interaction," outlined in a paper published to the arXiv preprint database in 2023, scientists are using quantum hardware to model the electronic structure of iron sulfides.

Scientists at RIKEN have also embarked on a project to build a quantum-high-performance-computing hybrid platform by integrating Fugaku, one of the world's fastest supercomputers, with an on-premises IBM System Two quantum computer powered by the Heron QPU.

]]>
https://www.livescience.com/technology/computing/ibms-newest-156-qubit-quantum-processor-runs-50-times-faster-than-its-predecessor-equipping-it-for-scientific-research TeDcYwg8tNsGwV6NaiYqjc Wed, 13 Nov 2024 14:00:40 +0000
<![CDATA[ New 'gold-plated' superconductor could be the foundation for massively scaled-up quantum computers in the future ]]> A new superconductor material could greatly improve the reliability of quantum computers, scientists say.

The electrical resistance of materials typically decreases as they are cooled. But some materials, called superconductors, maintain a gradually declining electrical resistance until they are cooled to their critical cut-off temperature, at which point their resistance becomes zero. Some types of superconductors, such as topological superconductors, can be used to transmit quantum data.

In a research paper published Aug. 23 in Science Advances, researchers at the University of California, Riverside, combined trigonal tellurium — a non-magnetic material and a type of chiral material (made of molecules that lack mirror-image symmetry) — with a thin film of gold.

They observed that the quantum states at the interface had well-defined polarization (a property of the quantum state of a subatomic particle). This could allow electron excitations to potentially be used as quantum bits (qubits) in a quantum computer.

The surface of the gold film became superconducting through the "proximity effect." This effect can occur when a non-superconducting material is placed in close contact with a superconductor: superconductivity leaks into the non-superconducting material, while slightly suppressing the superconductor's critical temperature. Because trigonal tellurium is chiral, its quantum properties cannot be superimposed on those of its physical mirror image.

"By creating a very clean interface between the chiral material and gold, we developed a two-dimensional interface superconductor," said lead author of the study, Peng Wei, an associate professor of physics and astronomy at the University of California, Riverside, in a statement. "The interface superconductor is unique as it lives in an environment where the energy of the spin is six times more enhanced than those in conventional superconductors."

Related: 'Unbreakable' quantum communication closer to reality thanks to new, exceptionally bright photons

The interface superconductor underwent a transition under a magnetic field and became more robust, the scientists said in the paper. This suggests it transformed into a "triplet superconductor" — a type of superconductor that is more resistant to magnetic fields than conventional superconductors.

They conducted the research in conjunction with the National Institute of Standards and Technology. In earlier work, they demonstrated that thin films of gold and niobium naturally suppress decoherence — the loss of quantum properties due to external environmental interference.

Given its robust quantum qualities and its ability to suppress decoherence, this new superconducting material promises to be ideal for use in quantum computers, the scientists said. Minimizing decoherence within the system is a key challenge, which necessitates extreme measures to isolate the quantum computer from external influences, such as shifts in temperature or electromagnetic interference, as well as the use of error-correcting algorithms to ensure calculations remain accurate.

The superconducting material was an order of magnitude thinner than those used in today’s quantum computers, which may prove useful for producing low-loss microwave resonator components in the future. Microwave resonators, which are an essential part of quantum computers, store and control electrons at microwave frequencies.

High-quality, low-loss microwave resonators are needed to enable more reliable quantum computers, and the scientists said that this new superconducting material is a promising candidate.

Unfortunately, the paper’s authors made no reference to the critical cut-off temperature of the material. If they could avoid decoherence at a warmer temperature, it could be a ground-breaking achievement in quantum computing research. The material properties the researchers demonstrated, however, offer encouragement in building the components needed to suppress decoherence. But how practical the material will be requires further exploration.

]]>
https://www.livescience.com/technology/computing/gold-plated-superconductor-could-be-the-foundation-for-massively-scaled-up-quantum-computers-in-the-future iB35xruR7M5J5BTru2QM3h Tue, 12 Nov 2024 12:30:00 +0000
<![CDATA[ Will we ever have quantum laptops? ]]> Roughly 80 years ago, the world was at war. Under a shroud of secrecy, scientists in the U.K., Germany and the U.S. were creating the first electronic computers. These computers filled rooms, demanded vast quantities of electricity and enabled previously impossible calculations. Few of the people involved could have imagined that decades later, computers orders of magnitude more powerful would fit in a backpack — yet that's exactly what happened.

So, as we sit on the threshold of genuinely useful quantum computing, could we ever see quantum laptops? "I think it's possible," Mario Gely, a quantum computing researcher at the University of Oxford, told Live Science. "It's highly speculative, but I can't think of a fundamental reason why a quantum laptop would not be possible."

Here are some of the steps that will be needed to get there.

Scaling up qubit number

Before scientists can make a quantum laptop, they need to make a useful quantum computer, period. Questions remain over how many qubits — quantum equivalents of digital bits — are needed to create a genuinely useful quantum computer, or one that can solve a range of useful, real-world problems that elude the best classical supercomputers. But it's definitely higher than is currently possible.

Stephen Bartlett, a theoretical quantum physicist and director of the University of Sydney's Nano Institute, thinks we could see genuinely useful quantum computers by the end of this decade. "There's a bunch of open scientific challenges, which makes that pathway a bit murky, but we're getting close," Bartlett told Live Science.

Related: What is the largest known prime number?

For instance, newly developed quantum charge-coupled device (QCCD) architecture could be used to make two-dimensional arrays of qubits rather than one-dimensional ones — which would increase the density, and potentially the number, of qubits.

Reducing the errors in quantum computers

But scaling brings another challenge in building a miniature quantum computer: correcting the errors caused by "noise." "Our existing quantum components are noisy, so we need error correction, and that necessitates a large amount of redundancy," Bartlett said. Scientists need to either reduce errors or build error correction into quantum computers, and that requires even more qubits. Many scientists are trying to solve this problem.

For example, a December 2023 study tried to reduce errors by building a quantum computer with "logical qubits." In another paper, published in April 2024, scientists designed a new type of qubit that behaved like an error-correcting logical qubit. Some scientists have even proposed using photons (light particles) as qubits, including another study that used a laser pulse. According to Peter van Loock, a professor of theoretical quantum optics at Johannes Gutenberg University of Mainz in Germany and co-author of the study, this approach has an "inherent capacity to correct errors".

So if, within a decade or two, powerful and useful quantum computers exist, the next step would be miniaturization.

Two laptops with neon abstract lights behind them

Will quantum computers ever look like laptops? (Image credit: Aitor Diago via Getty Images)

Choosing different types of qubits

But to get really small, quantum computers may need to focus on a different type of qubit than is currently popular. Some of the most advanced quantum computers today — such as those made by IBM and Google — rely on quantum processing units filled with superconducting qubits. But the first quantum laptop probably won't use this technology.

That's because, by their nature, superconducting qubits must be cooled to a fraction above absolute zero — around 20 millikelvin — and that requires filling a room with dilution refrigerators. And companies like IBM aren't trying to get around this size constraint. For example, IBM's current quantum computing roadmap sets out goals that include a 2,000-qubit quantum computer by 2033 — which would fill many rooms rather than one.

Quantum laptops may instead rely on trapped ion qubits, charged particles that exist in multiple states at once and that are suspended using electromagnetic fields, Bartlett and Gely explained. Although trapped ion systems work at room temperature and don't rely on room-sized refrigerators, the lasers they use are gigantic.

"At the moment, our laser system occupies approximately a cubic meter [35 cubic feet]," Gely said. "If we assume that ion traps are the future, then we need the lasers to become smaller."

And lasers must not only shrink but also become more advanced. Current systems are geared to control about 100 ions. "How many qubits you can control with this volume of laser equipment is unclear," Gely said. "You can control more qubits than we have today, but certainly not the millions of qubits of a fully fledged quantum computer."

However, two recent advances could help with miniaturization. First, future QCCDs could aid miniaturization by increasing qubit density. Second, in July, Stanford researchers created titanium-sapphire lasers that are 10,000 times smaller than the ones they replace.

Related: How does a secure phone line work?

Miniaturization efforts will ramp up

Right now, scientists are focused on making quantum computers more powerful, not on shrinking them. "The drive for miniaturization is not as strong at the moment as the drive for performance, and that mimics the early days of conventional computers when we had mainframes," Bartlett said. "People thought of the most powerful computers as taking up a building. And you know, why would anyone seriously consider carrying one around in your backpack?"

The history of computers suggests quantum computers will roll out first for industrial, military and government applications before shifting to consumers. The apocryphal 1943 quote from Thomas Watson Sr. that there would be a "world market for maybe five computers" springs to mind.

Of course, the world market for PCs and laptops is immense, so could there ever be a similar explosion of demand for quantum PCs and laptops? "The question I always get in my quantum computing classes is, you know, 'When can I play Doom on a quantum computer?'" Bartlett said. "But why would you want to when you can play Doom perfectly well on your computer today?"

Instead, Bartlett suggested there might be "quantum personal apps like finance or something niche around information security" — but the truth is, nobody knows. Gely made the alternative suggestion of a quantum processor sitting alongside a classical processor. "It could be like you have a graphics card, but it would only be useful for certain tasks," Gely said.

It's not yet clear that quantum laptops would be useful for consumers. What experts can say with a high level of confidence is that all of the hardware obstacles — scaling the number of qubits, correcting errors and miniaturizing components — can be overcome. And yet, a future quantum laptop probably won't play Doom.

]]>
https://www.livescience.com/technology/computing/will-we-ever-have-quantum-laptops FTcBx5LWWcZLbZ89YmzRoS Mon, 04 Nov 2024 10:00:00 +0000
<![CDATA[ New memory chip controlled by light and magnets could one day make AI computing less power-hungry ]]> Researchers have developed a new type of memory cell that can both store information and do high-speed, high-efficiency calculations.

The memory cell enables users to run high-speed computations inside the memory array, researchers reported Oct. 23 in the journal Nature Photonics. The faster processing speeds and low energy consumption could help scale up data centers for artificial intelligence (AI) systems.

"There's a lot of power and a lot of energy being put into scaling up data centers or computing farms that have thousands of GPUs [graphics processing units] that are running simultaneously," study co-author Nathan Youngblood, an electrical and computer engineer at the University of Pittsburgh, told Live Science. "And the solution hasn't necessarily been to make things more efficient. It's just been to buy more and more GPUs and spend more and more power. So if optics can address some of the same problems and do it more efficiently and faster, that would hopefully result in reduced power consumption and higher throughput machine learning systems."

The new cell uses magnetic fields to direct an incoming light signal either clockwise or counterclockwise through a ring-shaped resonator, a component that intensifies light of certain wavelengths, and into one of two output ports. Depending on the intensity of light at each of the output ports, the memory cell can encode a number between zero and one, or between zero and minus one. Unlike traditional memory cells, which only encode values of zero or one in one bit of information, the new cell can encode several non-integer values, allowing it to store up to 3.5 bits per cell.
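
The 3.5-bits-per-cell figure follows directly from how many output levels a cell can reliably distinguish, as the short Python calculation below illustrates (the level counts are illustrative, not the paper's calibration data):

```python
import math

# A cell read out as one of k distinguishable levels stores log2(k) bits,
# so 3.5 bits per cell implies roughly 2**3.5 ~ 11 reliably
# distinguishable intensity levels at the output ports.
for k in (2, 4, 12):
    print(f"{k} levels -> {math.log2(k):.2f} bits per cell")
print(f"3.5 bits -> {2 ** 3.5:.1f} levels needed")
```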

Related: New 'petabit-scale' optical disc can store as much information as 15,000 DVDs

Those counterclockwise and clockwise light signals are akin to "two runners on a track that are running in opposite directions around the track, and the wind is always in the face of one and to the back of the other. One can go faster than the other," Youngblood said. "You're comparing the speed at which those two runners are running around the track, and that allows you to basically code both positive and negative numbers."

The numbers that result from this race around the ring resonator could be used to either strengthen or weaken connections between nodes in artificial neural networks, which are machine learning algorithms that process data in ways similar to the human brain. That could help the neural network identify objects in an image, for example, Youngblood said.

Unlike traditional computers, which make calculations in a central processing unit then send results to memory, the new memory cells perform high-speed computations inside the memory array itself. In-memory computing is particularly useful for applications like artificial intelligence that need to process a lot of data very quickly, Youngblood said.

The researchers also demonstrated the endurance of the magneto-optic cells. They ran more than 2 billion write and erase cycles on the cells without observing any degradation in performance — a 1,000-fold improvement over past photonic memory technologies, the researchers wrote. Typical flash drives are limited to between 10,000 and 100,000 write and erase cycles, Youngblood said.

In the future, Youngblood and his colleagues hope to put multiple cells onto a computer chip and try more advanced computations.

Eventually, this technology could help mitigate the amount of power needed to run artificial intelligence systems, Youngblood said.

]]>
https://www.livescience.com/technology/artificial-intelligence/new-memory-chip-controlled-by-light-and-magnets-could-one-day-make-ai-computing-less-power-hungry qUK3UxxmoszhycHDJGNJhW Sat, 02 Nov 2024 16:00:00 +0000
<![CDATA[ Quantum computers are here — but why do we need them and what will they be used for? ]]> Technology companies are pouring billions of dollars into quantum computing, despite the technology still being years away from practical applications. So what will future quantum computers be used for — and why are so many experts convinced they will be game-changing?

Building a computer that harnesses the unusual properties of quantum mechanics is an idea that has been in contention since the 1980s. But in the last couple of decades, scientists have made significant strides in building large-scale devices. Now, a host of tech giants from Google to IBM as well as several well-funded startups have invested significant sums into the technology — and they have created several individual machines and quantum processing units (QPUs).

In theory, quantum computers could solve problems that are beyond even the most powerful classical computer. However, there’s broad consensus that such devices will need to become much larger and more reliable before that can happen. Once they do, however, there is hope that the technology will crack a host of currently unsolvable challenges in chemistry, physics, materials science and even machine learning.

"It's not just like a fast classical computer, this is a completely different paradigm,” Norbert Lütkenhaus, executive director of the Institute for Quantum Computing at the University of Waterloo in Canada told Live Science. "Quantum computers can solve some tasks efficiently that classical computers simply cannot do."

The current state-of-the-art

The most fundamental building block of a quantum computer is the qubit — a unit of quantum information that is comparable to a bit in a classical computer, but with the uncanny ability to represent a complex combination of both 0 and 1 simultaneously. Qubits can be implemented on a wide range of different hardware, including superconducting circuits, trapped ions or even photons (light particles).
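
Mathematically, a qubit's state is just a normalized pair of complex amplitudes. The small NumPy sketch below, included purely for illustration, prepares an equal superposition of 0 and 1:

```python
import numpy as np

# A qubit state is a normalized pair of complex amplitudes (alpha, beta);
# measurement yields 0 with probability |alpha|^2 and 1 with |beta|^2.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
state = np.array([alpha, beta])
probs = np.abs(state) ** 2
assert np.isclose(probs.sum(), 1.0)   # normalization check
print(probs)                          # [0.5 0.5] -- an equal superposition
```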

Today’s largest quantum computers have just crossed the 1,000 qubit mark, but most feature just a few tens or hundreds of qubits. They are far more error-prone than classical computing components due to the extreme sensitivity of quantum states to external noise, which includes temperature changes or stray electromagnetic fields. That means that it's currently difficult to run large quantum programs for long enough to solve practical problems.

Related: Radical quantum computing theory could lead to more powerful machines than previously imagined

That doesn’t mean today’s quantum computers are useless, though, said William Oliver, director of the Center for Quantum Engineering at the Massachusetts Institute of Technology (MIT) in the USA. "What quantum computers are used for today is basically to learn how to make quantum computers bigger, and also to learn how to use quantum computers," he said in an interview with Live Science.

Building ever larger processors provides crucial insight into how to engineer larger, more reliable quantum machines and provides a platform to develop and test novel quantum algorithms. They also let researchers test quantum error-correction schemes, which will be crucial for achieving the technology’s full promise. These typically involve spreading quantum information over multiple physical qubits to create a single "logical qubit," which is far more resilient.

Lütkenhaus said that recent breakthroughs in this area suggest fault-tolerant quantum computing might not be so far off. Several companies, including QuEra, Quantinuum and Google, have recently demonstrated the ability to generate logical qubits reliably. Scaling up to the thousands, if not millions, of qubits needed to solve practical problems will take time and a lot of engineering effort, Lütkenhaus said. But once that's been achieved, a host of exciting applications will come into view.

Where quantum could be a game changer

The secret to quantum computing’s power lies in a quantum phenomenon known as superposition, said Oliver. This allows a quantum system to occupy multiple states simultaneously until it is measured. In a quantum computer, this makes it possible to place the underlying qubits into a superposition representing all potential solutions to a problem.

"As we run the algorithm, the answers that are incorrect are suppressed and the answers that are correct are enhanced," said Oliver. "And so by the end of the calculation, the only surviving answer is the one that we're looking for."

This makes it possible to tackle problems too vast to work through sequentially, as a classical computer would have to, Oliver added. And in certain domains, quantum computers could carry out calculations exponentially faster than their classical cousins as the size of the problem grows.
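
A toy NumPy version of this suppress-and-enhance process, a bare-bones illustration of Grover-style amplitude amplification rather than any production algorithm, shows the effect on eight candidate answers:

```python
import numpy as np

# Toy amplitude amplification over 3 qubits = 8 candidate answers.
# The oracle flips the sign of the "correct" answer; the diffusion step
# reflects all amplitudes about their mean, boosting the marked one.
n_states, marked = 8, 5
amps = np.full(n_states, 1 / np.sqrt(n_states))  # uniform superposition

for _ in range(2):                 # ~ (pi/4) * sqrt(8) iterations
    amps[marked] *= -1             # oracle: tag the correct answer
    amps = 2 * amps.mean() - amps  # diffusion: inversion about the mean

print(np.round(amps ** 2, 3))      # probability of state 5 is now ~0.945
```

After two rounds, the marked answer carries about 94% of the probability, up from the initial 12.5%.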

One of the most obvious applications lies in simulating physical systems, said Oliver, because the world itself is governed by the principles of quantum mechanics. The same strange phenomena that make quantum computers so powerful also make simulating many quantum systems on a classical computer intractable at useful scales. But because they operate on the same principles, quantum computers should be able to model the behavior of a wide range of quantum systems efficiently.

This could have a profound impact on areas like chemistry and materials science where quantum effects play a major role, and could lead to breakthroughs in everything from battery technology to superconductors, catalysts and even pharmaceuticals.

Quantum computers also have some less savory uses. Given enough qubits, an algorithm invented by mathematician Peter Shor in 1994 could crack the encryption that underpins much of today’s internet. Fortunately, researchers have devised new encryption schemes that sidestep this risk, and earlier this year the US National Institute of Standards and Technology (NIST) released new "post-quantum" encryption standards that are already being implemented.

Emerging possibilities of quantum computing

Other applications for quantum computers are, at present, somewhat speculative, said Oliver.

There are hopes the technology could prove useful for optimization, which involves searching for the best solution to a problem with many possible solutions. Lots of practical challenges can be boiled down to optimization processes, from easing traffic flows through a city to finding the best delivery routes for a logistics company. Building the best portfolio of stocks for a specific financial goal is another possible application.

So far, though, most quantum optimization algorithms offer less than exponential speed-ups. Because quantum hardware operates much slower than current transistor-based electronics, these modest algorithmic speed advantages can quickly disappear when implemented on a real-world device.

At the same time, progress in quantum algorithms has spurred innovations in classical computing. "As quantum algorithm designers come up with different optimization schemes, our colleagues in computer science advance their algorithms and this advantage that we seem to have ends up evaporating," added Oliver.

Other areas of active research with less clear long-term potential include using quantum computers to search large databases or conduct machine learning, which involves analyzing large amounts of data to discover useful patterns. Speed-ups here are also less than exponential and there is the added problem of translating large amounts of classical data into quantum states that the algorithm can operate on — a slow process that can quickly eat into any computational advantage.

But it is still early days, and there is plenty of scope for algorithmic breakthroughs, said Oliver. The field is still in the process of discovering and developing the building blocks of quantum algorithms — smaller mathematical procedures known as "primitives" that can be combined to solve more complex problems.

"We need to understand how to build quantum algorithms, identify and leverage these program elements, find new ones if they exist, and understand how to put them together to make new algorithms," says Oliver.

This should guide the future development of the field, added Lütkenhaus, and is something companies should bear in mind when making investment decisions. "As we push the field forward, don't focus too early on very specific problems," he said. "We still need to solve many more generic problems and then this can branch off into many applications."

]]>
https://www.livescience.com/technology/computing/quantum-computers-are-here-but-why-do-we-need-them-and-what-will-they-be-used-for kPD4thSfQ7xnTHUNYrWKHY Fri, 01 Nov 2024 12:00:00 +0000
<![CDATA[ 'Quantum CD' could hold up to 1,000 times more data than today's optical disks ]]> Scientists have proposed a new type of data storage device that harnesses the powerful properties of quantum mechanics.

The ultra-high-density optical memory device would consist of numerous memory cells, each containing rare earth elements embedded within a solid material — in this case, magnesium oxide (MgO) crystals. The rare earth elements emit photons, or particles of light, which are absorbed by nearby "quantum defects" — vacancies in the crystal lattice containing unbonded electrons, which become excited by light absorption.

Current optical memory storage methods such as CDs and DVDs are constrained by the diffraction limit of light, meaning a single piece of data stored on the device cannot be smaller than the wavelength of the laser reading and writing the data. However, scientists hypothesized that optical disks could hold more data within the same area by using a technique called "wavelength multiplexing," in which slightly different wavelengths of light are used in combination.

Now, researchers propose that MgO could be interspersed with narrow-band rare earth emitters. These elements emit light at specific wavelengths, which could be densely packed together. The scientists published their findings Aug. 14 in the journal Physical Review Research.

"We worked out the basic physics behind how the transfer of energy between defects could underlie an incredibly efficient optical storage method," study co-author Giulia Galli, a professor at the University of Chicago's Pritzker School of Molecular Engineering, said in a statement.

The study modeled how light spreads at the nanometer scale to understand how energy moves between the rare earth emitters and the quantum defects within the material, as well as how the quantum defects store the captured energy, Galli added.

Scientists already understood how quantum defects in solid materials interact with light. But they had not studied how the quantum defects' behavior changes when the light source is incredibly close, such as narrow-band rare earth emitters embedded a few nanometers (a millionth of a millimeter) away.

The emitted photons have much shorter wavelengths than those of conventional lasers. By way of comparison, light from a conventional optical or near-infrared laser tends to have a wavelength of 500 nanometers to 1 micrometer (a thousandth of a millimeter). Hence, this new research could lead to data storage devices 1,000 times denser than previously possible.

The scientists discovered that when the quantum defects absorbed the narrow band of energy emitted from the nearby rare earth elements, they became excited from their ground state and flipped into a spin state. As the spin state transition is hard to reverse, these defects could potentially store data for a useful period — although further work would be required to measure this, the scientists said. Furthermore, narrow-band rare earth emitters generate shorter wavelengths of light, which enables a denser data storage method than other optical approaches.

Most quantum-based technologies operate at near absolute zero, which suppresses decoherence and dephasing — the corruption and loss of information in a quantum system. For technology based on this research to be viable, it would need to operate at room temperature.

"To start applying this to developing optical memory, we still need to answer additional basic questions about how long this excited state remains and how we read out the data," co-author Swarnabha Chattaraj, a postdoctoral researcher at Argonne National Laboratory, said in the statement. ​"But understanding this near-field energy transfer process is a huge first step."

]]>
https://www.livescience.com/technology/computing/quantum-cd-could-hold-up-to-1-000-times-more-data-than-todays-optical-discs Wmiz8Ad4dxSBg84imQX2rJ Fri, 25 Oct 2024 11:00:00 +0000
<![CDATA[ Scientists build the smallest quantum computer in the world — it works at room temperature and you can fit it on your desk ]]> Scientists have built the smallest quantum computer in the world. It is the size of a desktop PC and can work at room temperature.

The machine is powered by just one photon, or light particle, embedded in a ring-shaped optical fiber, the scientists wrote in a study published Sept. 3 in the journal Physical Review Applied. The machine is a proof of concept and can complete mathematical operations such as prime factorization — for example, 15 = 5 x 3.

Many quantum computers and processors, including IBM's 1,000-qubit Condor chip, are built using superconducting qubits. But to tap into the laws of quantum mechanics and calculate using quantum superposition — which allows the qubit to exist in multiple states simultaneously — they must be cooled to near absolute zero. This requires complex equipment that typically fills at least an entire room.

Photons have long been proposed as an alternative to superconducting qubits, in a field known as "optical quantum computing." In February, scientists suggested that building qubits from a single laser pulse could let them make a stable quantum computer at room temperature, for example.

Related: 'World's purest silicon' could lead to 1st million-qubit quantum computing chips

In the new study, the scientists built a machine that can process calculations at room temperature. And because it doesn't need to be chilled, it is the size of a typical desktop PC. The quantum computer stores information in "32 time-bins or dimensions" within the wave packet of a single photon, study lead author Chih-Sung Chuu, a professor of quantum optics at National Tsing Hua University in Taiwan, said in a translated statement. This is a world record for the number of computing dimensions accessed by a single qubit, he added.

A close-up of the smallest quantum computer in the world

The machine can process mathematical calculations at room temperature and it's small enough to fit on your desk. (Image credit: National Tsing Hua University)

Unlike superconducting qubits, photons can maintain a stable quantum state at room temperature, so a quantum machine that uses photons consumes less energy and is cheaper to run than a superconducting system. It is also more efficient than systems using trapped-ion qubits — charged particles suspended in free space by electromagnetic waves — which require complex lasers to precisely tune their quantum state.

Optical quantum computers with hundreds of photons already exist. But because photons appear probabilistically — meaning "they are there one second and disappear the next" — they are difficult to corral in large numbers, Chuu said.

Instead, Chuu and his team compressed all the information into one stable photon. He likened this work to transforming a bicycle that can carry one person into a 32-car train that can fit a huge number of passengers. The next steps are to continue improving the storage capacity of a single photon so that it can process even more complex calculations, he added.

Given the machine uses a photon as its qubit, it could easily be integrated into future quantum communication networks that use light to transmit data, or with other light-based classical computing systems, the scientists said.

]]>
https://www.livescience.com/technology/computing/scientists-build-the-smallest-quantum-computer-in-the-world-it-works-at-room-temperature-and-you-can-fit-it-on-your-desk MAuCsQUkwKx7dEtGhvPKXc Wed, 23 Oct 2024 12:00:00 +0000
<![CDATA[ Chinese scientists claim they broke RSA encryption with a quantum computer — but there's a catch ]]> Researchers in China say they've used a quantum computer to break RSA encryption. But that doesn't necessarily mean your emails or WhatsApp messages will be intercepted anytime soon.

Encryption is used to protect sensitive data, like banking information and medical records, when it is transmitted over the internet. RSA — named after its creators, Ron Rivest, Adi Shamir and Leonard Adleman — is a type of encryption, called asymmetric encryption, which uses two different-but-linked keys to solve a mathematical problem.

Encryption has proved to be a successful method for protecting sensitive information, as it requires mathematical computation so complex that it cannot be solved by even the most powerful supercomputers in the world today — unless they have the cryptographic key.

It has long been predicted that quantum computers would make current encryption technology obsolete. Quantum computers can process vast amounts of information in far less time than a conventional computer can. This is because, thanks to the laws of quantum mechanics — and the qubits that power them — they can process calculations in parallel rather than in sequence. In theory, this means that it will take a quantum computer just seconds to solve a problem that would take classical computers millions of years.

Related: Future quantum computers will be no match for 'space encryption' that uses light to beam data around — with the 1st satellite launching in 2025

Quantum computing is a nascent technology, however, and the most powerful quantum machines today have thousands of qubits. Scientists have projected we will need machines with millions of qubits to outperform our most powerful classical computers. Quantum computers also require dedicated laboratories, as well as expensive and complicated infrastructure.

But in a study published in the Chinese Journal of Computers in May, researchers found that D-Wave Advantage — a 5,760-qubit machine created by Canadian quantum computing company D-Wave — could break the RSA encryptions they challenged it to solve.

The machine did this through a process called quantum annealing. Quantum annealing uses quantum fluctuations — erratic changes in energy levels in quantum systems — to optimize a problem so it is solved in the easiest way possible.
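
Annealing hardware is analog and far subtler than any short program, but the class of problem it targets can be shown with a tiny QUBO (quadratic unconstrained binary optimization) instance, solved below by brute force; the coefficients are invented for illustration:

```python
from itertools import product

# A tiny QUBO instance -- the problem class annealers are built for.
# Q maps variable pairs to weights; the goal is the bit assignment
# minimizing the total "energy".
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}   # illustrative coefficients

def energy(x):
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

best = min(product([0, 1], repeat=2), key=energy)
print(best, energy(best))   # (0, 1) with energy -1 (tied with (1, 0))
```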

Although the researchers used a quantum computer to break RSA encryption, the key they attacked was based on only a 50-bit integer. Size really does matter in encryption: the strength of an RSA key relates to the length of the integer, which defines how big the problem is. For example, a 50-bit integer has 9.67 x 10^16 possible values.

But most modern encryption technologies use 1024- to 2048-bit integers. A 1024-bit integer has 1.797 x 10^308 possible values, while a 2048-bit integer has 3.231 x 10^616 possible values. Hence, the number of possible values for modern encryption methods is immensely larger — and, therefore, more complex — than the one overcome by the researchers.
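
The gap is easy to demonstrate: a 50-bit RSA-style modulus can be factored on a laptop in well under a minute by plain trial division. The sketch below is illustrative only and assumes the sympy package for prime generation:

```python
import math
from sympy import randprime

# Build an RSA-style ~50-bit modulus from two random ~25-bit primes, then
# factor it by trial division -- feasible precisely because the integer
# is tiny by modern cryptographic standards.
p = randprime(2**24, 2**25)
q = randprime(2**24, 2**25)
n = p * q

for d in range(3, math.isqrt(n) + 1, 2):  # n is odd, so skip even divisors
    if n % d == 0:
        print(f"{n} = {d} x {n // d}")
        break
```

The same loop applied to a 2048-bit modulus would not finish in any realistic timeframe, which is why the demonstration, while genuine, does not threaten deployed RSA.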

The research is an interesting proof of concept that reinforces the expectation that quantum computers can one day decrypt modern encryption technologies. Although not stated in the paper, the natural next steps for research like this will examine how D-Wave Advantage and quantum annealing can cope with encryption models with larger integers, such as 128- or 256-bit integers.

It also signals that quantum computers are coming and will have an impact on security that relies on encryption. That is why scientists are also building post-quantum cryptography technologies — a type of cryptography that uses algorithms that are resistant to being solved by quantum computers. However, like quantum computers, this technology is still years away from full realization.

]]>
https://www.livescience.com/technology/computing/chinese-scientists-claim-they-broke-rsa-encryption-with-a-quantum-computer-but-theres-a-catch wsixfbvtau6BwYuTQ7XJXE Tue, 22 Oct 2024 11:00:10 +0000
<![CDATA[ Google's Sycamore quantum computer chip can now outperform the fastest supercomputers, new study suggests ]]> Quantum computers can outpace our fastest classical computers in very specific areas, a groundbreaking experiment suggests.

Google Quantum AI researchers have discovered a "stable computationally complex phase" that can be achieved with existing quantum processing units (QPUs), also known as quantum processors.

This means that when quantum computers enter this specific "weak noise phase," they can perform computationally complex calculations that outpace the performance of the fastest supercomputers. The research — which was led by Alexis Morvan, a quantum computing researcher at Google — was published Oct. 9 in the journal Nature.

"We are focused on developing practical applications for quantum computers that cannot be done on a classical computer," Google Quantum AI representatives told Live Science in an email. "This research is a significant step in that direction. Our next challenge is to demonstrate a 'beyond classical' application with real-world impact."

However, the data produced by quantum computers is still noisy, meaning they still need to do fairly intensive quantum "error correction" as the number of qubits rises in order for the qubits to remain in the "weak noise phase," they added.

Related: History of quantum computing: 12 key moments that shaped the future of computers

Qubits, which are embedded in QPUs, rely on the principles of quantum mechanics to run calculations in parallel, whereas classical computing bits can only process data in sequence. The more qubits are on a QPU, the more exponentially powerful a machine becomes. Due to these parallel processing capabilities, calculations that would take a classical computer thousands of years to perform could be accomplished by a quantum computer in seconds.

But qubits are "noisy," meaning they are highly sensitive and prone to failure due to interference; approximately 1 in 100 qubits fails, versus 1 in 1 billion, billion bits. Examples include environmental disturbances such as temperature changes, magnetic fields or even radiation from space.

This high error rate means that to achieve "quantum supremacy," you would need extremely proficient error-correction technologies — which do not yet exist — or a quantum computer with millions of qubits. Scaling quantum computers isn't easy, with the most qubits in a single machine today standing at approximately 1,000.
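
A back-of-envelope calculation shows why error rates dominate the scaling discussion. Treating gate failures as independent (a simplification), the chance that a circuit finishes cleanly is (1 - error rate) raised to the number of gates:

```python
# Probability that a circuit of g gates finishes with no error at
# per-gate error rate e, assuming independent failures.
for e, g in [(1e-2, 100), (1e-2, 1_000), (1e-9, 1_000)]:
    print(f"error {e}, {g} gates -> success ~ {(1 - e) ** g:.6f}")
# 0.99**100 ~ 0.366, 0.99**1000 ~ 0.000043, near-1 at classical-like rates
```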

But the new experiment run by Google scientists suggests that quantum computers can withstand the current levels of noise and outperform classical computers in specific calculations. However, error correction may still be required when machines scale up.

The scientists used a method known as random circuit sampling (RCS) to test the fidelity of a 2D grid of superconducting qubits, which are among the most common types of qubit and are made from superconducting metal cooled to temperatures close to absolute zero. RCS is a benchmark that measures the performance of a quantum computer against that of a classical supercomputer, and it's the hardest benchmark to perform on a quantum computer, the scientists said.

The experiments revealed that working qubits can transition between a first phase and a second phase, called a "weak noise phase," by triggering certain conditions. In the experiments, the scientists artificially increased the noise or slowed the spread of quantum correlations. In this second, "weak noise phase," the computation was complex enough that they concluded a quantum computer could outperform a classical computer. They demonstrated this on Google's 67-qubit Sycamore chip.

"This is a waypoint on the journey to get to real-world applications, or beyond classical commercial applications," Google Quantum AI representatives said. "Those applications should not be replicable on a classical computer. Our results within this research is a significant step in that direction. If you cannot win on the RCS benchmark, you cannot win on anything else."

]]>
https://www.livescience.com/technology/computing/googles-sycamore-quantum-computer-chip-can-now-outperform-the-fastest-supercomputers-new-study-suggests fRUGCEPZa9Mh5aAHoRRGRV Thu, 10 Oct 2024 15:50:00 +0000
<![CDATA[ History of quantum computing: 12 key moments that shaped the future of computers ]]> Computers that exploit the weird rules of quantum mechanics may soon crack problems that are unsolvable using existing technology. Today’s machines are still far from achieving that, but the field of quantum computing has made dramatic progress since its inception.

Quantum computing has gone from an academic curiosity to a multi-billion-dollar industry in less than half a century and shows no signs of stopping. Here are 12 of the most important milestones on that journey.

1980: The quantum computer is born

By the 1970s, scientists had begun thinking about potential crossovers between the new fields of quantum mechanics and information theory. But it was American physicist Paul Benioff who crystallized many of these ideas when he published the first-ever description of a quantum computer. He proposed a quantum version of a "Turing machine" — a theoretical model of a computer, devised by renowned British computer scientist Alan Turing, that is capable of implementing any algorithm. By showing that such a device could be described using the equations of quantum mechanics, Benioff laid the foundations for the new field of quantum computing.

1981: Richard Feynman popularizes quantum computing

Both Benioff and legendary physicist Richard Feynman gave talks on quantum computing at the first Physics of Computation Conference in 1981. Feynman’s keynote speech was on the topic of using computers to simulate physics. He pointed out that because the physical world is quantum in nature, simulating it exactly requires computers that similarly operate based on the rules of quantum mechanics. He introduced the concept of a "quantum simulator," which cannot implement any program like a Turing machine, but can be used to simulate quantum mechanical phenomena. The talk is often credited for kick-starting interest in quantum computing as a discipline.

1985: The "universal quantum computer"

One of the foundational concepts in computer science is the idea of the universal Turing machine. Introduced by its namesake in 1936, this is a particular kind of Turing machine that can simulate the behavior of any other Turing machine, allowing it to solve any problem that is computable. However, David Deutsch, a professor in the quantum theory of computation, pointed out in a 1985 paper that because the universal computer described by Turing relied on classical physics, it would be unable to simulate a quantum computer. He reformulated Turing’s work using quantum mechanics to devise a “universal quantum computer,” which is capable of simulating any physical process.

1994: First killer use case for quantum computers

Despite the theoretical promise of quantum computers, researchers had yet to find clear practical applications for the technology. American mathematician Peter Shor became the first to do so when he introduced a quantum algorithm that could efficiently factorize large numbers. Factorization is the process of breaking a number down into the smaller numbers — ultimately primes — that multiply together to produce it. This process becomes increasingly difficult for larger numbers and is the basis for many leading encryption schemes. Shor's algorithm can solve these problems exponentially faster than classical computers, though, raising fears that quantum computers could be used to crack modern encryption and spurring the development of post-quantum cryptography.

1996: Quantum computing takes on search

It didn’t take long for another promising application to appear. Bell Labs computer scientist Lov Grover proposed a quantum algorithm for unstructured search, which refers to looking for information in databases with no obvious system of organization. This is like looking for the proverbial needle in a haystack and is a common problem in computer science, but even the best classical search algorithms can be slow when faced with large amounts of data. The Grover algorithm, as it has become known, exploits the quantum phenomenon of superposition to dramatically speed up the search process.

1998: First demonstration of a quantum algorithm

Dreaming up quantum algorithms on a blackboard is one thing, but actually implementing them on hardware had proven much harder. In 1998, a team led by IBM researcher Isaac Chuang made a breakthrough when they showed that they could run Grover’s algorithm on a computer featuring two qubits — the quantum equivalent of bits. Just three years later Chuang also led the first implementation of Shor’s algorithm on quantum hardware, factoring the number 15 using a seven-qubit processor.

1999: The birth of the superconducting quantum computer

The fundamental building blocks of a quantum computer, known as qubits, can be implemented on a wide range of different physical systems. But in 1999, physicists at Japanese technology company NEC hit upon an approach that would go on to become the most popular approach to quantum computing today. In a paper in Nature, they showed that they could use superconducting circuits to create qubits, and that they could control these qubits electronically. Superconducting qubits are now used by many of the leading quantum computing companies, including Google and IBM.

2011: First commercial quantum computer released

Despite considerable progress, quantum computing was still primarily an academic discipline. The launch of the first commercially available quantum computer by Canadian company D-Wave in May 2011 heralded the start of the quantum computing industry. The start-up’s D-Wave One featured 128 superconducting qubits and cost roughly $10 million. However, the device wasn’t a universal quantum computer. It used an approach known as quantum annealing to solve a specific kind of optimization problem, and there was little evidence it provided any speed boost compared to classical approaches.

2016: IBM makes quantum computer available over the cloud

While several large technology companies were developing universal quantum computers in-house, most academics and aspiring quantum developers had no way to experiment with the technology. In May 2016, IBM made its five-qubit processor available over the cloud for the first time, allowing people from outside the company to run quantum computing jobs on its hardware. Within two weeks more than 17,000 people had registered for the company’s IBM Quantum Experience service, giving many their first hands-on experience with a quantum computer.

2019: Google claims "quantum supremacy"

Despite theoretical promises of massive "speedup," nobody had yet demonstrated that a quantum processor could solve a problem faster than a classical computer. But in September 2019, news emerged that Google had used 53 qubits to perform a calculation in 200 seconds that it claimed would take a supercomputer roughly 10,000 years to complete. The problem in question had no practical use: Google’s processor simply performed random operations and then researchers calculated how long it would take to simulate this on a classical computer. But the result was hailed as the first example of "quantum supremacy," now more commonly referred to as "quantum advantage."

2022: A classical algorithm punctures supremacy claim

Google’s claim of quantum supremacy was met with skepticism from some corners, in particular from arch-rival IBM, which claimed the speedup was overstated. A group from the Chinese Academy of Sciences and other institutions eventually showed that this was the case, by devising a classical algorithm that could simulate Google’s quantum operations in just 15 hours on 512 GPU chips. They claimed that with access to one of the world’s largest supercomputers, they could have done it in seconds. The news was a reminder that classical computing still has plenty of room for improvement, so quantum advantage is likely to remain a moving target.

2023: QuEra smashes record for most logical qubits

One of the biggest barriers for today’s quantum computers is that the underlying hardware is highly error-prone. Due to the quirks of quantum mechanics, fixing those errors is tricky, and it has long been known that many physical qubits must be combined to create so-called “logical qubits” that can detect and correct errors and carry out operations reliably. In December 2023, Harvard researchers working with start-up QuEra smashed records by generating 48 logical qubits at once – 10 times more than anyone had previously achieved. The team was able to run algorithms on these logical qubits, marking a major milestone on the road to fault-tolerant quantum computing.

]]>
https://www.livescience.com/technology/computing/history-of-quantum-computing-key-moments-that-shaped-the-future-of-computing RSnKjHBkdGtuKas7RUkEu4 Mon, 30 Sep 2024 14:00:00 +0000
<![CDATA[ What is a quantum processing unit (QPU)? ]]> One of the core components of a quantum computer is the quantum processing unit (QPU), or quantum processor. Instead of the binary bits used in classical computing, quantum computers use quantum bits – or qubits for short. These qubits are built from quantum systems, such as subatomic particles or superconducting circuits, that exploit the properties of quantum mechanics to represent and process vast amounts of data.

A quantum processor manipulates qubits in order to complete tasks. It is akin to a conventional computer's central processing unit (CPU), which performs calculations using the information held in binary bits – the 1s and 0s of data.

Unlike classical processors, quantum processors use quantum logic gates (or quantum gates) to manipulate qubits and perform calculations. Quantum gates are fundamentally different from binary logic gates, as they are designed to take advantage of the weird properties of quantum physics. Those bizarre rules enable quantum computers to carry out certain calculations in a fraction of the time it would take normal binary computers.
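
To make that difference concrete, here is a minimal Python sketch (using NumPy; the Hadamard matrix is the standard textbook form, and the variable names are our own): where a binary logic gate maps definite 0s and 1s to definite 0s and 1s, a quantum gate is a matrix that rotates a qubit's state vector and can place it in a superposition.

    import numpy as np

    # A qubit's state is a 2-element complex vector:
    # [amplitude of measuring 0, amplitude of measuring 1].
    ket0 = np.array([1, 0], dtype=complex)  # the definite state "0"

    # A quantum gate is a unitary matrix. The Hadamard gate turns a
    # definite state into an equal superposition of 0 and 1.
    H = np.array([[1, 1],
                  [1, -1]], dtype=complex) / np.sqrt(2)

    superposed = H @ ket0
    print(np.abs(superposed) ** 2)  # [0.5 0.5]: a 50/50 chance of 0 or 1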

As the technology is still in its infancy, there is currently no standard architecture or approach to developing a quantum processor, and one quantum processor may operate in a completely different way from another. Because of these differing architectures, it can be hard to compare the capabilities of different types of quantum processors.

How do quantum processors work?

The development of quantum processors still faces significant practical challenges. To be effective and accurate, a quantum processor must maintain qubit stability and have a viable error correction system in place. Both elements are essential for building quantum computers capable of performing large-scale calculations accurately.

Part of the problem is that qubits are inherently fragile and can be affected by a variety of external environmental conditions. A stable qubit state, which is essential for accuracy, is often (but not exclusively) achieved by using high-powered magnetic fields or by cooling the qubits to near absolute zero.

There are also ongoing investigations into a variety of technologies that can detect unwanted changes in the qubit states, alongside those that correct or compensate for external interferences.

These technical challenges make quantum processors incredibly delicate machines that are sensitive to the slightest external interference. Even tiny vibrations can prevent a quantum processor from successfully completing a task, meaning these devices are not yet suitable for use outside a laboratory environment.

Classical algorithms can’t simply be run on quantum hardware, so quantum processors execute quantum algorithms, which are written to exploit quantum properties such as superposition and entanglement. This allows them to solve certain problems far faster than classical approaches can.

Although quantum processors may be able to perform computations too complex for today’s most powerful supercomputers, they cannot solve undecidable problems: problems that no computer, classical or quantum, can solve even in principle.

The development of quantum processing units is akin to the early evolution of conventional CPUs. Although a variety of quantum processor architectures exist today, these may converge toward standard designs as the technology is refined and improved in the coming years.

]]>
https://www.livescience.com/technology/computing/what-is-a-quantum-processing-unit-qpu PabWNqrZyTz4i4rJnRaPdB Mon, 30 Sep 2024 12:00:00 +0000
<![CDATA[ What is a quantum bit (qubit)? ]]> A quantum bit, otherwise known as a qubit, is the basic unit of data in quantum computing. Like a binary bit in a classical computer, it can store information, but it behaves very differently thanks to quantum mechanics.

Quantum computers normally use subatomic particles, such as photons (packets of light) or electrons, as qubits. In qubits, properties such as charge, photonic polarization or spin represent the 1s and 0s in binary computing. However, qubits are also subject to phenomena known as superposition and entanglement, due to their quantum nature, which is where things start to get weird.

Bits vs qubits: What's the difference?

As well as being either 0 or 1, like a bit, qubits can occupy both states at the same time — or a superposition of 1 and 0. The qubit will remain in superposition until it is directly observed or disrupted by external environmental factors, such as heat. Because this quantum state is so delicate, qubits have to be kept free from interference, which requires very cold temperatures.

Superposition allows the qubits of a quantum computer to be in multiple states (0, 1 or both), and the number of possible states grows exponentially as qubits are added. If you have two classical bits, for example, at any given time they can take only one of four values: 00, 01, 10 or 11.

With two qubits, you can encode data in all four states at once. As such, quantum computers potentially have far greater processing power than conventional computers using binary bits: the more qubits you have, the more calculations you can process in parallel, and this capacity grows exponentially with each qubit added to the system. However, to see exponential growth in processing power, you must also entangle the qubits.
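
As a rough illustration of that exponential growth, here is a minimal Python sketch (using NumPy; `uniform_superposition` is a name of our own invention): an n-qubit state is described by one complex amplitude for each of the 2^n classical bit strings, so the description doubles in size with every qubit added.

    import numpy as np

    # An n-qubit state carries 2**n complex amplitudes, one per classical
    # bit string. Two classical bits hold one of four values at a time;
    # two qubits carry a weight for all four values at once.
    def uniform_superposition(n_qubits):
        dim = 2 ** n_qubits
        return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

    state = uniform_superposition(2)
    for bits, amp in zip(["00", "01", "10", "11"], state):
        print(bits, round(abs(amp) ** 2, 3))  # each outcome: probability 0.25

    print(len(uniform_superposition(20)))  # 1,048,576 amplitudes for 20 qubits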

How does entanglement work?

In quantum entanglement, the states of subatomic particles are linked, regardless of how far apart they may be. Gaining information about a qubit will automatically provide information about its entangled particle.

Entangled particles are always in a correlated state. Consequently, if a property (such as spin) of one particle is measured, bringing it out of superposition, its entangled partner instantaneously collapses into the correlated state. Since the states of the two entangled particles are always correlated, knowing the state of one means the state of the other can be inferred.
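
A toy Python simulation of this correlation (a sketch of the measurement statistics only, not of the underlying physics): measuring one half of a maximally entangled pair gives a random result, but the partner's result always matches, so either one reveals the other.

    import random

    # Toy model of measuring an entangled pair prepared so that the two
    # particles always agree: each measurement collapses randomly to 0 or
    # 1, and the partner is found in the perfectly correlated state.
    def measure_entangled_pair():
        outcome = random.choice(["0", "1"])  # 50/50 collapse
        return outcome, outcome              # partner always matches

    print([measure_entangled_pair() for _ in range(5)])
    # e.g. [('1', '1'), ('0', '0'), ('1', '1'), ('0', '0'), ('0', '0')]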

Related: Prototype quantum processor boasts record 99.9% qubit fidelity

Rather than directly measuring the qubit, and thereby causing it to lose its superposition state, scientists are investigating whether there might be a way of indirectly inferring information about a qubit from its interaction with the surrounding environment.

Quantum entanglement also correlates the states of qubits instantaneously, regardless of the distance between them. When combined with superposition, entanglement theoretically enables qubits to greatly enhance the computing power of quantum computers, allowing them to perform complex calculations that powerful binary computers would struggle to resolve.

This is currently possible at a small scale, but the challenge is to scale it up. Some calculations, such as breaking modern encryption, would take classical computers millions of years. If we could build a quantum computer with millions of high-quality qubits, however, that same encryption could in theory be broken in a matter of hours.

Why are qubits so fragile and prone to decoherence?

So why haven't we simply stacked up more and more qubits to build such a machine? Unfortunately, qubits are short-lived: their superposition can collapse under the faintest external influence, such as heat or movement. For that reason, they are deemed "noisy" and error-prone.

As a result, many qubits need to be chilled to near absolute zero and maintained using specialized equipment. They also have incredibly short "coherence times", a measure of how long they retain the desired state needed to process quantum calculations. Coherence times usually last only fractions of a second. (The world record is 10 minutes for a single qubit, but experts think this is unlikely to translate to a real quantum computer.) This also makes qubits unsuitable for long-term data storage.
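
To get a feel for what a coherence time means in practice, here is a toy Python sketch under a simple assumption, namely that a qubit's chance of still holding its state decays exponentially with a characteristic coherence time (the 100-microsecond figure below is purely illustrative, not a measured value for any real device).

    import math

    # Illustrative decoherence model: survival probability exp(-t / T),
    # where T is the qubit's coherence time (assumed value below).
    T = 100e-6  # assume a 100-microsecond coherence time

    for t in (1e-6, 10e-6, 100e-6, 1e-3):
        survival = math.exp(-t / T)
        print(f"after {t * 1e6:7.1f} microseconds: {survival:.4f} chance intact")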

Although many quantum computers exist today, we still need to apply "error correction" techniques to qubits to trust their results. One major error correction method under investigation is the "logical qubit": a group of entangled, error-prone physical qubits that store the same information in different places. Spreading the information out means that an error in any single physical qubit can be detected and corrected while a calculation is underway. If qubits can be stabilized sufficiently, with superposition and entanglement intact, quantum computers could one day perform calculations in a fraction of the time a binary computer would need, as well as solve complex problems that are out of reach for even today's most powerful supercomputers.
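
The redundancy idea behind a logical qubit can be sketched with a classical analogy in Python (our own toy example; real quantum codes such as the surface code are far subtler, since quantum states cannot simply be copied): store one bit in three noisy physical copies and recover it by majority vote.

    import random
    from collections import Counter

    def encode(bit):
        # One "logical" bit stored redundantly in three physical bits.
        return [bit, bit, bit]

    def noisy_channel(bits, flip_prob=0.1):
        # Each physical bit flips independently with some probability.
        return [b ^ 1 if random.random() < flip_prob else b for b in bits]

    def decode(bits):
        # Majority vote recovers the logical bit unless 2+ copies flipped.
        return Counter(bits).most_common(1)[0][0]

    received = noisy_channel(encode(1))
    print(received, "->", decode(received))  # usually recovers 1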

]]>
https://www.livescience.com/technology/computing/what-is-a-quantum-bit-qubit zUBkaUZj4XmUy2AedGh3WH Wed, 18 Sep 2024 13:00:00 +0000
<![CDATA[ Japan to start building 1st 'zeta-class' supercomputer in 2025, 1,000 times more powerful than today's fastest machines ]]> Japan has announced plans to start constructing the first ever "zeta-class" supercomputer next year. Once fully operational, it will be 1,000 times faster than today's most powerful supercomputers.

The supercharged machine, which could cost more than $750 million to build, will help Japan keep up with the pace of artificial intelligence (AI) development and is expected to be fully online by 2030.

Plans for the new machine — first released on Aug. 28 by Japan's Ministry of Education, Culture, Sports, Science and Technology (MEXT) — reveal that the supercomputer could reach speeds on a zetaFLOPS scale, which has never been achieved before.

Floating-point operations per second (FLOPS) is a measure of how fast computers can solve problems, where one floating-point operation is a single calculation. A supercomputer with a speed of 1 zetaFLOPS could perform one sextillion (1 followed by 21 zeros) calculations per second. Today's most powerful supercomputers have only just broken the exaFLOPS barrier, meaning they can perform just over one quintillion (1 followed by 18 zeros) calculations per second.
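
The arithmetic behind that jump is simple enough to check directly; the short Python sketch below (with a made-up workload size, chosen only for illustration) shows what moving from the exa scale to the zeta scale means for a single job.

    EXA = 10 ** 18    # exaFLOPS: the scale of today's fastest machines
    ZETA = 10 ** 21   # zetaFLOPS: the proposed machine's target scale

    workload = 10 ** 24  # a hypothetical job of 10^24 calculations

    print(workload / EXA, "seconds at 1 exaFLOPS")    # 1,000,000 s (~11.6 days)
    print(workload / ZETA, "seconds at 1 zetaFLOPS")  # 1,000 s (~17 minutes)
    print(ZETA / EXA, "times faster")                 # 1,000x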

The decision to build such a superpowered machine has been made "in order to keep up with the development of scientific research using AI," Japanese news site Nikkei wrote in a translated article.

Related: Computer inspired by Japanese art of paper-cutting has no electronics and stores data in tiny cubes

The proposed supercomputer is being touted as the successor to Japan's Fugaku supercomputer (0.44 exaFLOPS), which held the title of the world's fastest supercomputer until it was dethroned in 2022 by the U.S.'s Frontier supercomputer (1.2 exaFLOPS) at Oak Ridge National Laboratory in Tennessee. Fugaku is currently considered the fourth most powerful supercomputer in the world.

The new machine, currently referred to as "Fugaku Next," will be built by Japanese research institute RIKEN and technology company Fujitsu, both of which were involved in the construction of Fugaku. To allow cross-compatibility between Fugaku and Fugaku Next, the latter will likely use components designed by Fujitsu, according to computing news site Tom's Hardware. However, little else is known about the components that will go into the proposed machine.

One of the biggest challenges engineers will face in building the new supercomputer is making it run efficiently. In 2023, computing experts predicted that a zeta-class machine built using current supercomputer technologies would require energy equivalent to the output of 21 nuclear power plants, computing news website HPCwire previously reported.

MEXT has set aside around ¥4.2 billion ($29 million) for the first year of the project but could allocate up to ¥110 billion ($761 million) throughout the project, which is scheduled to be completed by 2030, according to Tom's Hardware.

As long as construction goes to plan and nobody else builds a zeta-class machine first (which seems highly unlikely), Fugaku Next will likely be the most powerful supercomputer on Earth when it comes online.

]]>
https://www.livescience.com/technology/computing/japan-to-start-building-1st-zeta-class-supercomputer-in-2025-1000-times-more-powerful-than-todays-fastest-machines 5mpa4mUMPpdiFnv7ZPBJjb Tue, 10 Sep 2024 10:00:38 +0000