Google’s Quantum Computer Claims To Do 10k-year Calculation In Just 200 Seconds
This week Google achieved a massive milestone in the field of quantum computing, one that could reshape how we compute and process data: the search giant is claiming unprecedented quantum computing results, saying it has achieved quantum supremacy.
Google has published its latest study in Nature, claiming that its quantum system executed a calculation in 200 seconds that would have taken a classical computer 10,000 years to complete.
In the research paper, the scientists explain how they designed, built, and tested a quantum processor that could perform a highly complex computational task in roughly three minutes.
In its official statement, Google said: “Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output”.
The quantum processor is named Sycamore, and it consists of 54 qubits interconnected in a lattice pattern (one qubit proved defective, leaving 53 in use for the experiment).
The quantum processor chip looks much like a conventional computer chip. It sits in a casing at the bottom of a structure shaped like an upside-down wedding cake, held inside a vacuum chamber.
The environment grows progressively colder with each tier until it reaches the 15-millikelvin operating temperature. A mess of wires sends tiny microwave pulses to the qubits, driving them into excited states that are read out by another tiny component attached to each plus-sign-shaped qubit.
Classical computers function in a binary fashion: they carry out tasks using tiny fragments of data known as bits, which are only ever either 1 or 0. The corresponding fragments of data on a quantum computer, known as qubits, can be both 1 and 0 at the same time.
In Sycamore, each qubit is made from a tiny, plus-sign-shaped loop of superconducting wire. The ability to be 0 and 1 at once, known as superposition, means a quantum computer made up of several qubits can explore an enormous number of potential outcomes simultaneously, which ultimately leads to faster execution for certain problems.
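As a loose illustration of why qubit counts matter, simulating a quantum computer classically means tracking one amplitude per basis state, i.e. 2^n numbers for n qubits. A minimal sketch (this is illustrative arithmetic only, not Google's simulation code; the variable names are invented here):

```python
import math

# Sketch only: a single qubit modeled as two amplitudes.
# (Sycamore's qubits are physical superconducting circuits; this
# just illustrates superposition, nothing more.)
amp0 = 1 / math.sqrt(2)   # amplitude of |0>
amp1 = 1 / math.sqrt(2)   # amplitude of |1>  -> equal superposition

# Measurement yields 0 or 1 with probability |amplitude|^2.
print(amp0**2, amp1**2)   # both ~0.5, up to floating-point rounding

# A register of n qubits requires tracking 2**n amplitudes, which is
# why the classical simulation cost explodes with qubit count.
n = 53
print(f"Amplitudes to track for {n} qubits: {2**n:,}")
```

With 53 qubits, a full state vector already has about 9 quadrillion amplitudes, which is the root of the "supremacy" argument.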
Google’s CEO Sundar Pichai said: “For those of us working in technology, it’s the ‘hello world’ moment we’ve been waiting for: the most meaningful milestone to date in the quest to make quantum computing a reality.”
“This demonstration of quantum supremacy over today’s leading classical algorithms on the world’s leading supercomputers is truly a remarkable achievement,” William Oliver, a computer researcher at the Massachusetts Institute of Technology, wrote in a comment piece on the discovery.
“This experiment establishes that today’s quantum computers can outperform the best conventional computing for a synthetic benchmark,” says Travis Humble, the director of Oak Ridge’s Quantum Computing Institute.
NASA and the Oak Ridge National Laboratory also took part in this research alongside Google.
On the other side of the debate, IBM, one of the major players in quantum computing, disputes Google’s results and has raised several questions about the research.
IBM has claimed that the classical supercomputer in Google’s comparison (IBM’s own Summit machine) wasn’t utilized efficiently, which it says explains the less-than-flattering 10,000-year estimate.
By configuring Summit differently for the same task, IBM argues, “an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity.” IBM’s researchers have published both a blog post and a working paper to make their case.
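For scale, the two classical estimates imply very different speedup factors over Sycamore's 200-second run. The back-of-the-envelope arithmetic, using only the figures quoted above (this is the author's own scratch calculation, not an official one):

```python
# Speedups implied by the two classical estimates quoted in the
# article, relative to Sycamore's 200-second runtime.
SECONDS_PER_YEAR = 365 * 24 * 3600            # 31,536,000
sycamore_runtime = 200                        # seconds

google_estimate = 10_000 * SECONDS_PER_YEAR   # Google: 10,000 years on Summit
ibm_estimate = 2.5 * 24 * 3600                # IBM: 2.5 days on Summit

print(f"Google's figure implies a {google_estimate / sycamore_runtime:,.0f}x speedup")
print(f"IBM's figure implies a {ibm_estimate / sycamore_runtime:,.0f}x speedup")
```

Under Google's estimate Sycamore is over a billion times faster; under IBM's revised estimate the advantage shrinks to roughly a thousandfold, which is why the framing of the result matters so much.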
“While we believe that some quantum computations will be out of reach of any conventional computer, it is a challenge to argue that any particular set of processes cannot be simulated through some suitable trick,” quantum information theorist Stephen Bartlett from the University of Sydney told ScienceAlert last month.
“I suspect that the first claims of quantum supremacy will be followed by a lengthy period of contention, where scientists push the limits of conventional supercomputers to find a way to simulate these claimed demonstrations,” he added.
“It is likely that the classical simulation time, currently estimated at 10,000 years, will be reduced by improved classical hardware and algorithms, but, since we are currently 1.5 trillion times faster, we feel comfortable laying claim to this achievement,” says one of the group, Brooks Foxen from UC Santa Barbara.
In response to IBM’s and other researchers’ questions, Google’s lead researcher John Martinis said IBM’s initial retort remained hypothetical and had to be substantiated.
“We’re looking forward to when people actually run the idea on Summit and check it and check our data because that’s part of the scientific process – not just proposing it but actually running it and checking it,” Martinis said.
What was the computational task?
Quantum encryption gets all the attention, but what about quantum decryption? As I have been estimating these past couple of days: however fast a quantum computer enumerates the combinations, the result is just a set of candidates, not necessarily the one correct password, so an attacker would still have to verify each guess by trial-and-error login. My rough figure: (3^53 candidates / 10 million simultaneous login attempts × 0.003 ms per attempt) / 60 / 60 / 24 / 365 = ?? How long would the whole test take? Could something like Newton’s method cut the time in half? Even so, it would take a real genius to get around a worst case of 184,391.6064 years. I don’t know how experts solve this in practice. Experts are welcome to correct my mistakes! Thank you!!
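For what it's worth, the commenter's arithmetic does reproduce their quoted figure under their own assumptions (3^53 candidates, 10 million parallel attempts, 0.003 ms per attempt; all inputs are the commenter's, not figures from the article):

```python
# Reproducing the commenter's brute-force login estimate.
# Every input below is the commenter's assumption, not a measured figure.
candidates = 3 ** 53                 # candidate combinations
parallel_logins = 10_000_000         # simultaneous login attempts
seconds_per_try = 0.003 / 1000       # 0.003 ms per attempt, in seconds

total_seconds = candidates / parallel_logins * seconds_per_try
years = total_seconds / (60 * 60 * 24 * 365)
print(f"{years:,.1f} years")         # ~184,391.6
```

Whether 3^53 is the right candidate count for any real password scheme is a separate question; the arithmetic itself checks out.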
Google is dominating in search, maps, and mobile, and now in supercomputing. Great.