Thursday, October 26, 2023

Atom Computing Says Its New Quantum Computer Has Over 1,000 Qubits


The size of quantum computers is rising rapidly. In 2022, IBM took the top spot with its 433-qubit Osprey chip. Yesterday, Atom Computing announced they've one-upped IBM with a 1,180-qubit neutral atom quantum computer.

The new machine runs on a tiny grid of atoms held in place and manipulated by lasers in a vacuum chamber. The company's first 100-qubit prototype was a 10-by-10 grid of strontium atoms. The new system is a 35-by-35 grid of ytterbium atoms (shown above). (The machine has room for 1,225 atoms, but Atom has so far run tests with 1,180.)

Quantum computing researchers are working on a variety of qubits, the quantum equivalent of the bits represented by transistors in traditional computing. These include tiny superconducting loops of wire (Google and IBM), trapped ions (IonQ), and photons, among others. But Atom Computing and other companies, like QuEra, believe neutral atoms (atoms with no electric charge) have greater potential to scale.

That's because neutral atoms can maintain their quantum state longer, and they're naturally abundant and identical. Superconducting qubits, by contrast, are more susceptible to noise and manufacturing flaws. Neutral atoms can also be packed more tightly into the same space, since they have no charge that could interfere with neighbors, and they can be controlled wirelessly. And neutral atoms allow for a room-temperature setup, as opposed to the near-absolute-zero temperatures required by other quantum computers.

The company may be onto something. They've now increased the number of qubits in their machine by an order of magnitude in just two years, and believe they can go further. In a video explaining the technology, Atom CEO Rob Hays says they see "a path to scale to millions of qubits in less than a cubic centimeter."

"We think the amount of challenge we had to face to go from 100 to 1,000 is probably significantly greater than the amount of challenges we're going to face when going to whatever we want to go to next: 10,000, 100,000," Atom cofounder and CTO Ben Bloom told Ars Technica.

But scale isn't everything.

Quantum computers are extremely finicky. Qubits can be knocked out of their quantum states by stray magnetic fields or gas particles. The more this happens, the less reliable the calculations. While scaling got a lot of attention a few years ago, the focus has shifted to error correction in service of scale. Indeed, Atom Computing's new computer is bigger, but not necessarily more powerful. The whole thing can't yet be used to run a single calculation, for example, due to the accumulation of errors as the qubit count rises.
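To see why error accumulation bites harder as machines grow, here is a minimal back-of-the-envelope sketch with made-up numbers (these are not Atom Computing's specs): if each qubit independently survives a computation error-free with probability p, the chance that all n qubits do is p raised to the n, which collapses as n grows.

```python
# Illustrative only: an assumed per-qubit survival probability,
# not a figure from Atom Computing.
def error_free_probability(p_per_qubit: float, n_qubits: int) -> float:
    """Probability that every one of n independent qubits stays error-free."""
    return p_per_qubit ** n_qubits

# Even with 99.9% per-qubit reliability, scaling from 100 to 1,180
# qubits drops the chance of a clean run from ~90% to ~31%.
for n in (100, 1180):
    print(n, round(error_free_probability(0.999, n), 3))
```

The toy model ignores gate errors and correlations, but it captures the core scaling problem: more qubits means exponentially more opportunities for something to go wrong, which is why error correction matters more as machines grow.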

There has been recent movement on this front, however. Earlier this year, the company demonstrated the ability to check for errors mid-calculation and potentially fix those errors without disturbing the calculation itself. They also need to keep errors to a minimum overall by increasing the fidelity of their qubits. Recent papers, each showing encouraging progress in low-error approaches to neutral atom quantum computing, give fresh life to the endeavor. Reducing errors may be, in part, an engineering problem that can be solved with better gear and design.

"The thing that has held back neutral atoms, until these papers were published, has just been all the classical stuff we use to control the neutral atoms," Bloom said. "And what that has essentially shown is that if you can work on the classical stuff (work with engineering firms, work with laser manufacturers, which is something we're doing) you can actually push down all that noise. And now all of a sudden, you're left with this incredibly, incredibly pure quantum system."

In addition to error correction in neutral atom quantum computers, IBM announced this year they've developed error correction codes for quantum computing that could reduce the number of qubits needed by an order of magnitude.

Still, even with error correction, large-scale, fault-tolerant quantum computers will need hundreds of thousands or millions of physical qubits. And other challenges exist too, such as how long it takes to move and entangle increasingly large numbers of atoms. Better understanding and working to solve these challenges is why Atom Computing is chasing scale at the same time as error correction.

In the meantime, the new machine can be used on smaller problems. Bloom said if a customer is interested in running a 50-qubit algorithm (the company is aiming to offer the computer to partners next year), they'd run it multiple times using the whole computer to arrive at a reliable answer more quickly.
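The idea of repeating a small algorithm across a big machine can be sketched in a few lines. This is a hypothetical illustration, not Atom's actual software stack: the `run_noisy_algorithm` function and its error model are invented stand-ins, and the premise is simply that many noisy repetitions, aggregated by majority vote, yield a reliable answer.

```python
import random
from collections import Counter

def run_noisy_algorithm(true_answer: int, error_rate: float) -> int:
    """Stand-in for one execution of a 50-qubit algorithm:
    returns a wrong result with probability error_rate."""
    if random.random() < error_rate:
        return true_answer + 1  # an arbitrary wrong outcome
    return true_answer

def majority_vote(true_answer: int, copies: int, error_rate: float) -> int:
    """Run many copies and return the most common outcome."""
    results = [run_noisy_algorithm(true_answer, error_rate) for _ in range(copies)]
    return Counter(results).most_common(1)[0][0]

random.seed(0)
# 1,180 qubits could hold roughly 23 independent 50-qubit copies per shot,
# so each shot of the full machine acts like 23 repetitions at once.
print(majority_vote(true_answer=42, copies=23, error_rate=0.1))
```

Even with a 10% chance of any single run going wrong, the majority across 23 parallel copies is wrong only vanishingly rarely, which is the intuition behind using the whole computer to speed up a smaller calculation.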

In a field of giants like Google and IBM, it's impressive a startup has scaled its machines so quickly. But Atom Computing's 1,000-qubit mark isn't likely to stand alone for long. IBM is planning to complete its 1,121-qubit Condor chip later this year. The company is also pursuing a modular approach (not unlike the multi-chip processors common in laptops and phones) in which scale is achieved by linking many smaller chips.

We're still in the nascent stages of quantum computing. The machines are useful for research and experimentation but not practical problems. That multiple approaches are making progress on scale and error correction, two of the field's grand challenges, is encouraging. If that momentum continues in the coming years, one of these machines may finally solve the first useful problem that no traditional computer ever could.

Image Credit: Atom Computing
