
Quantum Computer Systems. Interview with Fred Chong and Yongshan Ding

by Roberto V. Zicari on November 6, 2020

“Quantum computing is incredibly exciting because it is the only technology that we know of that could fundamentally change what is practically computable, and this could soon change the foundations of chemistry, materials science, biology, medicine, and agriculture.” — Fred Chong.

I have interviewed Fred Chong, Seymour Goodman Professor in the Department of Computer Science at the University of Chicago, and Yongshan Ding, PhD candidate in the Department of Computer Science at the University of Chicago, advised by Fred Chong. They have just published an interesting book on Quantum Computer Systems.

RVZ

Q1. What is quantum computing?

Ding: Quantum computing is a model of computation that exploits the unusual behavior of quantum mechanical systems (e.g., particles at extremely small energy and distance scales) for storing and manipulating information. Such quantum systems have significant computational potential: they allow a quantum computer to operate on an exponentially large computational space, offering efficient solutions to problems that appear intractable in the classical computing paradigm.

Q2. What are the potential benefits of this new paradigm of computing?

Chong:  Quantum computing is incredibly exciting because it is the only technology that we know of that could fundamentally change what is practically computable, and this could soon change the foundations of chemistry, materials science, biology, medicine, and agriculture.

Quantum computing is the only technology in which every device that we add to a machine doubles the potential computing power of the machine. If we can overcome the challenges in developing practical algorithms, software, and machines, quantum computing could solve some problems where computation grows too quickly (exponentially in the size of the input) for classical machines.

In the short term, quantum computing will change our understanding of the aforementioned sciences, which fundamentally rely on understanding the behavior of electrons. A classical computer uses an exponential number of bits (electrons) to model the positions of electrons and how they change. Obviously, nature only uses one electron to “model” each electron in a molecule. Quantum computers will use only a small (constant) number of electrons to model molecules.
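To make the scaling concrete, here is a small Python sketch (an illustration for this interview, not code from the book) of how much memory a classical machine needs just to store the full state vector of n qubits; each added qubit doubles the figure:

```python
import numpy as np

def state_vector_bytes(n_qubits: int, dtype=np.complex128) -> int:
    """Memory needed to hold a full n-qubit state vector classically:
    2**n complex amplitudes, each occupying itemsize bytes."""
    return (2 ** n_qubits) * np.dtype(dtype).itemsize

for n in (10, 30, 50):
    print(f"{n:2d} qubits -> {state_vector_bytes(n) / 2**30:.3g} GiB")
# 10 qubits -> ~1.5e-05 GiB, 30 qubits -> 16 GiB, 50 qubits -> ~1.7e+07 GiB
```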

Q3. What is the current status of research in quantum computing?

Ding: It is no doubt an exciting time for quantum computing. Research institutions and technology companies worldwide are racing toward practical-scale, fully programmable quantum computers. Many others, although not building prototypes themselves, are joining in by investing in the field. Just last year, Google used its Sycamore prototype to demonstrate a first “quantum supremacy” experiment, performing a 200-second computation that would otherwise take days on a classical supercomputer [1],[2]. We have entered what John Preskill, a long-time leader in quantum computing, calls the “noisy intermediate-scale quantum” (NISQ) technology era, in which non-error-corrected systems are used to implement quantum simulations and algorithms. In our book, we discuss several of the recent advances in NISQ algorithm implementations, the software/hardware interface, and qubit technologies, and highlight what roles computer scientists and engineers can play in enabling practical-scale quantum computing.

Q4. What are the key principles of quantum theory that have direct implications for quantum computing?

Chong:  Paraphrasing our research teammate, Peter Shor, quantum computing derives its advantage from the combination of three properties:  an exponentially large state space, superposition, and interference.  Each quantum bit you add to a system doubles the machine’s state space, resulting in exponential growth with the number of qubits.  These states can exist simultaneously in superposition, and manipulating them creates interference among the states, resulting in patterns that can be used to solve problems such as factoring large numbers with Shor’s algorithm.
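As a minimal numpy sketch of superposition and interference (a toy illustration, not tied to any particular quantum SDK): applying a Hadamard gate to |0> creates an equal superposition, and applying it a second time makes the amplitudes interfere so that the qubit returns to |0> with certainty.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)          # the state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

superposed = H @ ket0      # amplitudes (1/sqrt(2), 1/sqrt(2)): equal superposition
back = H @ superposed      # amplitudes interfere: returns to (1, 0), i.e. |0>

print(np.abs(superposed) ** 2)   # measurement probabilities [0.5 0.5]
print(np.abs(back) ** 2)         # measurement probabilities [1. 0.] (up to rounding)
```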

Q5. What are the challenges in designing practical quantum programs?

Chong: There are only a small number of quantum algorithms that have an advantage over classical computation.  A practical quantum program must use these algorithms as kernels to solve a practical problem.  Moreover, quantum computers can only take a small number of bits as input and return a small number of bits as output.  This input-output limitation constrains the kinds of problems we can solve.  Having said that, there are still some large classes of problems that may be solved by quantum programs, such as problems in optimization, learning, chemistry and physical simulation.

Q6. What are the main challenges in building a scalable software systems stack for quantum computing?

Ding: A quantum computer implements a fundamentally different model of computation than a modern classical computer does. It would be surprising if the exact design of a classical computer architecture extended well to a quantum computer. The architecture of a quantum computer resembles that of a classical computer in the 1950s, where device constraints are so severe that full-stack sharing of information is required, from algorithms down to devices. Example constraints include hardware noise, limited qubit connectivity/communication, the inability to copy data, and probabilistic outcomes. In the near term, these constraints make it challenging for a quantum computer to follow the modularity and layering models seen in classical architectures.
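One of those constraints is easy to illustrate: limited qubit connectivity. If two qubits that must interact are not adjacent on the device, the compiler has to insert SWAP operations to bring them together. The toy Python sketch below does this for a device whose qubits form a linear chain (a deliberate simplification; real toolchains use far more sophisticated mapping and scheduling):

```python
def route_cx_linear(control: int, target: int) -> list:
    """Schedule a CX (controlled-NOT) between two qubits on a device whose
    qubits are connected in a line 0-1-2-..., inserting SWAPs until the
    control sits next to the target (a deliberately naive router)."""
    ops = []
    q = control
    step = 1 if target > q else -1
    while abs(target - q) > 1:
        ops.append(("SWAP", q, q + step))   # move the qubit one hop closer
        q += step
    ops.append(("CX", q, target))
    return ops

print(route_cx_linear(0, 3))
# [('SWAP', 0, 1), ('SWAP', 1, 2), ('CX', 2, 3)]
```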

Q7. What kind of hardware is required to run quantum programs?

Chong:  There are actually several technologies that may run practical quantum programs in the future. The leading ones right now are superconducting devices and trapped ions.  Optical devices and neutral atoms are also promising.  Majorana devices are a longer-term alternative that, if realized, could have substantially better reliability than our current options.

Q8. Quantum computers are notoriously susceptible to making errors (*). Is it possible to mitigate this?

Ding: Research into the error mitigation and correction of quantum computation has produced several promising approaches and motivated interdisciplinary collaboration. On the device side, researchers have developed techniques such as dynamical decoupling and its generalizations, which introduce additional pulses into the system to filter noise from the signal. On the systems software side, a compiler tool-flow that is aware of device noise and application structure can significantly improve the program success rate. On the application side, algorithms tailored for near-term devices can circumvent noise through a hybrid classical-quantum approach to information processing. In the long term, when qubits are more abundant, the theory of quantum error correction allows computation to proceed fault-tolerantly by encoding quantum state so that errors can be detected and corrected, similar to the design of classical error-correcting codes.
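The classical end of that analogy is simple to demonstrate. The Python sketch below simulates a three-bit repetition code under independent bit flips; majority voting corrects any single flip, pushing the logical error rate well below the physical one (the quantum bit-flip code works analogously, except that errors must be detected via stabilizer measurements without reading out the encoded data):

```python
import random

def encode(bit):                 # logical 0 -> [0,0,0], logical 1 -> [1,1,1]
    return [bit] * 3

def noisy_channel(codeword, p):  # each physical bit flips independently with prob. p
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):            # majority vote corrects any single flip
    return int(sum(codeword) >= 2)

p, trials = 0.1, 100_000
failures = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
print(f"logical error rate ~ {failures / trials:.3f} vs. physical rate {p}")
# expected ~ 3*p**2 - 2*p**3 = 0.028, well below 0.1
```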

Q9. Some experts say that we’ll never need quantum computing for everyday applications. What is your take on this?

Chong:  It is true that quantum computing will likely take the form of hardware accelerators for specialized applications.  Yet some of these applications include general optimization problems that may be helpful for many everyday applications.  Finance, logistics, and smart grid are examples of applications that may touch many users every day.

Q10. There is much speculation regarding the cybersecurity threats of quantum computing (**). What is your take on this?

Chong: In the long term, quantum machines may pose a serious challenge to the basis of modern cryptography. Digital commerce relies upon public-key cryptography systems that use a “pseudo-one-way function.” For example, it should be easy to digitally sign a document, but it should be hard to reverse-engineer the secret credentials of the signer from the signature. The most practical implementation of a one-way function today is to multiply two large prime numbers together: it is hard to find the two primes from their product. All known classical algorithms take superpolynomial time in the number of bits of the product (the RSA key). This is the basis of RSA cryptography. Large quantum computers (running Shor’s algorithm) could find these primes in roughly n^3 time for an n-bit key.
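The number-theoretic core of this threat is easy to sketch: Shor’s algorithm uses the quantum computer only to find the period r of a^x mod N, and classical post-processing then recovers the factors. The toy Python sketch below performs that classical part, brute-forcing the period that a quantum computer would find efficiently (an illustration, not an implementation of the quantum step):

```python
from math import gcd

def order(a, N):
    """Smallest r > 0 with a**r = 1 (mod N). This is the step a quantum
    computer performs efficiently via period finding; here it is brute-forced."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_order(N, a):
    """Classical post-processing in Shor's algorithm: turn the period of
    a mod N into a nontrivial factorization of N (when the choice of a is lucky)."""
    r = order(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                       # unlucky choice of a; pick another
    y = pow(a, r // 2, N)
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_from_order(15, 7))   # (3, 5): the two primes behind the toy "key" 15
```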

This quantum capability would force us to re-invent our cryptosystems to use other means of encryption that are both secure and inexpensive. An entire field of “post-quantum cryptography” has grown to address this problem. Secure solutions exist that resist quantum attacks, but finding any that are as simple as the product of two primes has been challenging. On the positive side, quantum computers and quantum networks may also help with security, providing new means of encrypting and securely communicating data.

Q11. What is the vision ahead for quantum computing?

Chong: Practical quantum computation may be achievable in the next few years, but applications will need to be error tolerant and make the best use of a relatively small number of quantum bits and operations.   Compilation tools will play a critical role in achieving these goals, but they will have to break traditional abstractions and be customized for machine and device characteristics in a manner never before seen in classical computing.

Q12. How does this book differ from other quantum computing books?

Ding: This is the first book to highlight research challenges and opportunities in near-term quantum computer architectures. In doing so, we develop the new discipline of “quantum computing systems”, which adapts classical techniques to address the practical issues at the hardware/software interface of quantum systems. As such, this book can be used as an introductory guide to quantum computing for computer scientists and engineers, spanning a range of topics across the systems stack, including quantum programming, compiler optimizations, noise mitigation, and simulation of quantum computation.

 ———————————

Professor Fred Chong, April 3, 2019 at Crerar. (Photo by Jean Lachat)

Fred Chong is the Seymour Goodman Professor in the Department of Computer Science at the University of Chicago. He is also Lead Principal Investigator for the EPiQC Project (Enabling Practical-scale Quantum Computing), an NSF Expedition in Computing. Chong received his Ph.D. from MIT in 1996 and was a faculty member and Chancellor’s fellow at UC Davis from 1997-2005. He was also a Professor of Computer Science, Director of Computer Engineering, and Director of the Greenscale Center for Energy-Efficient Computing at UCSB from 2005-2015. He is a recipient of the NSF CAREER award, the Intel Outstanding Researcher Award, and 9 best paper awards. His research interests include emerging technologies for computing, quantum computing, multicore and embedded architectures, computer security, and sustainable computing. Prof. Chong has been funded by NSF, DOE, Intel, Google, AFOSR, IARPA, DARPA, Mitsubishi, Altera and Xilinx. He has led or co-led over $40M in awarded research, and been co-PI on an additional $41M.  


Yongshan Ding is a PhD candidate in the Department of Computer Science at the University of Chicago, advised by Fred Chong. Before UChicago, he received dual B.Sc. degrees in Computer Science and Physics from Carnegie Mellon University. His research interests are in the areas of computer architecture and algorithms, particularly in the context of quantum computing. His work spans the theory and application of quantum error correction, efficient and reliable quantum memory management, and optimizations at the hardware/software interface.

Resources


– Quantum Computer Systems: Research for Noisy Intermediate-Scale Quantum Computers. Yongshan Ding, University of Chicago, and Frederic T. Chong, University of Chicago. Morgan & Claypool Publishers, ISBN: 9781681738666 | 2020 | 227 pages. Link to Web Site


Related Posts
– On using AI and Data Analytics in Pharmaceutical Research. Interview with Bryn Roberts. ODBMS Industry Watch. September 10, 2018

Quote from Industry:  

“What I’m particularly excited about just now is the potential of universal quantum computing (QC). Progress made over the last couple of years gives us more confidence that a fault-tolerant universal quantum computer could become a reality, at a useful scale, in the coming years. We’ve begun to invest time, and explore collaborations, in this field. Initially, we want to understand where and how we could apply QC to yield meaningful value in our space. Quantum mechanics and molecular dynamics simulation are obvious targets, however, there are other potential applications in areas such as Machine Learning. I guess the big impacts for us will follow “quantum inimitability” (to borrow a term from Simon Benjamin from Oxford) in our use-cases, possibly in the 5-15 year timeframe, so this is a rather longer-term endeavour.” — Dr. Bryn Roberts, Global Head of Operations for Roche Pharmaceutical Research & Early Development.

Follow us on Twitter @odbmsorg
