
Nanoelectronics November 2017 Viewpoints

Technology Analyst: Guy Garrud

Quantum Supremacy

By Alastair Cunningham
Cunningham is an independent consultant specializing in nanomaterials and electronics.

Why is this topic significant?

Quantum systems could yield a step change in computing power. Recent developments suggest that quantum computers could soon outperform leading supercomputers, heralding a major milestone for the technology.

Description

In May 2017, Google announced its intention to produce, by the end of the year, a 49-qubit quantum-computing chip that could potentially outperform existing supercomputers. Such a demonstration of quantum supremacy would be a groundbreaking advance, with widespread implications for the semiconductor industry. Google is approaching this feat in stages, recently producing a 9-by-1 array of qubits and testing its fabrication technology on a 2-by-3 array. The challenges in scaling to a 7-by-7 array are numerous: larger systems suffer higher error rates and are more difficult to fabricate. The team plans to demonstrate quantum supremacy by sampling the output of a random circuit run on the integrated qubit array, a task with which conventional supercomputers struggle.
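A back-of-the-envelope memory estimate illustrates why simulating such a sampling task is hard for classical machines (a rough sketch; the 16-bytes-per-amplitude figure assumes each complex amplitude is stored as two double-precision floats, and is not a figure from the article):

```python
# Memory needed to hold the full state vector of an n-qubit system:
# 2**n complex amplitudes, each stored as two 8-byte floats (16 bytes).
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (9, 49):
    petabytes = state_vector_bytes(n) / 1e15
    print(f"{n} qubits -> {petabytes:.4g} PB of RAM")
```

At 49 qubits the state vector alone approaches ten petabytes, which is roughly why arrays of that size sit near the edge of what leading supercomputers can simulate directly.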

Meanwhile, in August 2017, Microsoft and Purdue University signed a five-year agreement to work together on the development of a functioning quantum computer. Other members of the collaboration include the Delft University of Technology and the University of Sydney. The team will focus on the development of a "scalable topological quantum computer" that is not as susceptible to interference from its surroundings as are other types of systems.

Implications

Google's efforts to prove quantum supremacy unequivocally have, at present, no practical applications. Indeed, an array of 49 qubits remains a considerable way from what physicists calculate will be necessary to perform the types of calculations that could yield useful results. For example, a leading Google quantum-computing scientist stated in Nature in March 2017 that "factorizing a 2,000-bit number in one day, a task believed to be intractable using classical computers, would take 100 million qubits." However, Google's current efforts—if successful—would go some way toward justifying the increasingly large resources that go to research in this field. They would also represent a significant step toward the much larger systems that incorporate error correction, which is the principal reason quantum computers must contain so many qubits.
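The gap between the 100-million-qubit figure and the number of logical qubits the algorithm itself needs gives a sense of the error-correction overhead. A minimal sketch, assuming the common textbook estimate that Shor's algorithm on an n-bit number requires roughly 2n + 3 logical qubits (an assumption introduced here, not a figure from the article):

```python
# Rough illustration of error-correction overhead (assumed figures).
# Logical-qubit count uses the 2n + 3 textbook estimate for Shor's
# algorithm; the physical-qubit count is the figure quoted in Nature.
n_bits = 2000
logical_qubits = 2 * n_bits + 3          # ~4,003 logical qubits
physical_qubits = 100_000_000            # quoted for a 2,000-bit number
overhead = physical_qubits / logical_qubits
print(f"~{overhead:,.0f} physical qubits per logical qubit")
```

Under these assumptions, each error-corrected logical qubit would consume on the order of tens of thousands of physical qubits, which is why fault-tolerant machines must be so large.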

The Microsoft approach—focusing on topological quantum computers—addresses the issues relating to scalability and stability associated with other quantum technologies. However, the project is ambitious and will require the team to make significant and simultaneous progress in the fields of materials science, solid-state physics, electrical engineering, and computer architecture.

Impacts/Disruptions

When quantum computing begins to find use in practical situations, it will likely result in a step change in how computers affect daily lives—in a manner similar to that of the technological revolution caused by developments in the semiconductor industry. A clear and irrefutable demonstration of the superiority of quantum computing—as opposed to the achievements of D-Wave, which are the subject of great debate—will represent a major milestone along that road. Of course, other major players are also making advances in this field. For example, in March 2017, IBM announced plans to construct a commercially available 50-qubit system that would also outperform conventional computers.

Scale of Impact

The scale of impact for this topic is: High

Time of Impact

The time of impact for this topic is: 10 Years to 15 Years

Opportunities in the following industry areas:

Computing, security, pharmaceuticals, big data, artificial intelligence

Relevant to the following Explorer Technology Areas:

DNA-Data-Storage Commercialization

By Alastair Cunningham
Cunningham is an independent consultant specializing in nanomaterials and electronics.

Why is this topic significant?

Exponentially increasing volumes of data are placing a strain on existing storage technologies. DNA-based techniques are emerging as a potential means of storing data at extremely high densities.

Description

In May 2017, Microsoft researchers announced their intention to use DNA-data storage in practical applications before the end of the decade. Microsoft's precise aim is to develop a "proto-commercial system in three years storing some amount of data on DNA in one of our data centers, for at least a boutique application." The initial device would be roughly the size of a large office printer, although its size is likely to shrink as the technology develops.

The key advantage of using DNA for data-storage applications is its unparalleled density—in March 2017, researchers from Columbia University and the New York Genome Center demonstrated the potential to store up to 215 petabytes in a single gram of DNA. Other advantages include DNA's durability and its fundamental importance to humans—technology based on DNA is unlikely ever to become obsolete or unreadable. The principal barriers to the commercialization of DNA-data storage are price and write speed. Microsoft claims that DNA-storage costs are currently 10,000 times too high for the technology to see wide adoption. Similarly, write speeds would have to improve by a factor of approximately 250,000 to be competitive with current data-storage techniques.
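The reported density makes for striking arithmetic. A quick sketch, using the 215-petabytes-per-gram figure from the text (the one-zettabyte reference volume is an illustrative assumption, not a figure from the article):

```python
# Back-of-the-envelope mass estimate from the reported DNA density.
PB = 10 ** 15  # bytes per petabyte (decimal convention)

density_bytes_per_gram = 215 * PB  # Columbia / NY Genome Center figure

# Grams of DNA needed to archive one zettabyte (10**21 bytes),
# an illustrative large-archive volume.
zettabyte = 10 ** 21
grams_needed = zettabyte / density_bytes_per_gram
print(f"~{grams_needed:.0f} g of DNA per zettabyte")
```

By this measure, an entire zettabyte-scale archive would weigh a few kilograms—hence the interest despite the cost and write-speed barriers.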

Implications

As conventional data-storage technologies reach their physical limitations, recording the exponentially increasing volumes of data will require alternative approaches. Microsoft's announcement demonstrates the extent to which major players within the semiconductor industry consider that DNA-based data storage could address this issue and become a substantial commercial success.

The technology, facing both cost-based and technical challenges, must improve if it is to compete with current industry standards and alternative emerging data-storage methods. However, these barriers are not insurmountable for a resource-rich integrated-circuit industry. A concerted research and development effort by several major players, already of the opinion that DNA storage could yield a technological step change, is likely to accelerate the commercialization of this technology.

Impacts/Disruptions

In the first instance, DNA data storage is likely to find use in situations or applications in which the recording of large volumes of information for long periods is particularly important. The technology could supplant magnetic tape drives, which commonly find use in long-term archiving. Examples of specific applications include the archiving of legal documents, the storage of medical records, and the recording of police-body-cam video output.

In addition to the major players developing technology in this field, a number of smaller companies are also working on DNA technology. For instance, Microsoft has close links with Twist Bioscience—a US-based DNA manufacturer. Other examples of start-ups developing DNA-writing technology for data-storage applications include DNAScript, Molecular Assemblies, Catalog Technologies, and Helixworks. Securing intellectual property and partnerships with large semiconductor players could prove highly lucrative for these firms.

Scale of Impact

The scale of impact for this topic is: High

Time of Impact

The time of impact for this topic is: 5 Years to 10 Years

Opportunities in the following industry areas:

Data storage, semiconductor, big data

Relevant to the following Explorer Technology Areas: