In an interview during his recent visit to India, Gil said he also met some government officials including Rajeev Chandrasekhar, minister of state (MoS) for electronics and information technology, with whom he shared how IBM could help create a national quantum plan in India. Gil also explained how enterprises and governments can benefit from technologies such as the hybrid cloud, edge computing, quantum computing, and shared his thoughts on Web3. Edited excerpts:
IBM outlined its commitment almost five years back to grow a quantum-ready workforce and build an ecosystem to nurture the community in India. What’s the progress?
We have made a tremendous amount of progress and, in fact, it was one of the core aspects of the discussion I had with the minister (Rajeev Chandrasekhar). They intend to make sure that India is a powerhouse in the world of quantum skills and quantum technologies. In this context, access to technology is crucial. That’s why we’re committed to the open-source environment; the most widely used one around the world is Qiskit. We’re seeing tremendous adoption in terms of advocates and quantum ambassadors here in India, and we’re also having many conversations right now with different Indian Institutes of Technology (IITs) and leading centers for training, to develop a curriculum and certification. The Qiskit textbook (to learn quantum computation) is now also available in Tamil, Bengali, and Hindi. We’re going to be running many workshops and lots of programs around it. I think there’s a tremendous opportunity, and part of our commitment is to figure out a way to grow these broad-based skills and talent programs in India for quantum.
What’s the progress on quantum computers, and how do they currently compare with supercomputers?
Most of the computation will continue to run on classical computers, whether on central processing units (CPUs) or accelerators such as graphics processing units (GPUs) and AI accelerators, but there are several important problems that are very well suited to quantum computers. One of them is simulating and modeling our world.
It turns out there are also mathematical problems of great importance that are well suited to quantum computers, such as cryptography and factoring. Blockchain, crypto, and other such technologies are going to have to adapt and change because of the advances in quantum.
We have over 180 institutions that are part of the IBM Quantum Network, and they include some of the largest corporations in the world: from the financial sector, Goldman Sachs, JP Morgan Chase, Wells Fargo, and Mizuho Bank; others such as Daimler; big energy companies in the oil and gas sector; and some materials companies. There’s also a huge appetite among universities and students, with research laboratories participating in this.
But when will the world get to see a stable quantum computer that will work around current limitations such as noise leading to higher error rates, interference, etc.?
We already have quantum computers but, as you correctly pointed out, they have limitations. We still haven’t crossed the threshold of quantum advantage (the so-called quantum advantage or quantum supremacy is the point at which a quantum system performs functions that today’s classical computers cannot), but they are quantum computers nonetheless. We have built over 30 of them in the last 4-5 years, of which over 20 are active right now, with IBM providing access to them through the IBM Cloud. Every day we run three-and-a-half billion quantum circuits on actual quantum hardware.
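To make the idea of a quantum circuit concrete, here is a toy, stdlib-only Python sketch that simulates the smallest interesting circuit, a two-qubit entangling (Bell state) circuit, on a classical statevector. This is purely illustrative; real workloads of the kind Gil describes run through a framework such as Qiskit on actual quantum hardware.

```python
import math

def apply_h(state, qubit, n):
    """Apply a Hadamard gate to `qubit` in an n-qubit statevector."""
    s = 1 / math.sqrt(2)
    new = state[:]
    for i in range(2 ** n):
        if not (i >> qubit) & 1:       # handle each amplitude pair once
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            new[i] = s * (a + b)
            new[j] = s * (a - b)
    return new

def apply_cnot(state, control, target, n):
    """Swap amplitude pairs of `target` wherever the `control` bit is 1."""
    new = state[:]
    for i in range(2 ** n):
        if (i >> control) & 1 and not (i >> target) & 1:
            j = i | (1 << target)
            new[i], new[j] = state[j], state[i]
    return new

# |00> --H on qubit 0--> --CNOT(0,1)--> (|00> + |11>) / sqrt(2)
state = [1.0, 0.0, 0.0, 0.0]
state = apply_h(state, 0, 2)
state = apply_cnot(state, 0, 1, 2)
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]
```

The output shows the hallmark of entanglement: measuring the pair yields 00 or 11 with equal probability, and never 01 or 10.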
The roadmap we shared is that last year we built a quantum computer with over 100 qubits; this year, we’re going to build a 433-qubit machine and, next year, a machine with over 1,000 qubits (a quantum computer comprises quantum bits, or qubits, which can encode a one and a zero simultaneously. This property allows them to process a lot more information than traditional computers, and at unimaginable speeds.).
The error rate of the qubits is also improving tremendously (we can get to error rates of 10 to the minus 4). And the algorithms and software, the techniques we use for error mitigation and error correction, are also improving. If you combine all of this, even if you want to be conservative, we’re going to see quantum advantage this decade.
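One widely used error-mitigation technique of the kind mentioned here is zero-noise extrapolation: run the same circuit at deliberately amplified noise levels, then extrapolate the results back to the zero-noise limit. A minimal sketch, with made-up measured values standing in for real hardware runs:

```python
# Toy zero-noise extrapolation. The noise scales and "measured"
# expectation values below are hypothetical, for illustration only.
noise_scales = [1.0, 2.0, 3.0]
measured = [0.90, 0.82, 0.74]   # noisy expectation values at each scale

# Linear least-squares fit y = a + b*x, then read off the intercept
# a, which estimates the expectation value at zero noise (x = 0).
n = len(noise_scales)
mx = sum(noise_scales) / n
my = sum(measured) / n
b = sum((x - mx) * (y - my) for x, y in zip(noise_scales, measured)) \
    / sum((x - mx) ** 2 for x in noise_scales)
a = my - b * mx
print(round(a, 2))  # 0.98, the zero-noise estimate
```

The mitigated estimate (0.98) is closer to the ideal noiseless value than any single noisy run, at the cost of extra circuit executions rather than extra qubits, which is what distinguishes mitigation from full error correction.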
What’s the roadmap for quantum computing?
We have seen AI-centric or GPU-centric supercomputers, and we are most definitely going to see quantum-centric supercomputers. This is how it may work out. Imagine a quantum computer with hundreds or thousands of qubits in a single cryostat (heat creates errors in qubits, hence they need to be cooled to near absolute zero in a device called a cryostat that contains liquid helium), and now imagine a quantum data center with multiple cryostats.
You could build a data center that has thousands or tens of thousands of qubits, but the connections between these different cryostats are, in the first generation, classical. If you’re smart enough to take a problem and partition it in such a way that you can run parallel workloads on the quantum machines and then stitch the results together classically, you still incur an exponential cost in the classical piece but can still get to a good answer.
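The partition-then-recombine pattern described here can be sketched in a few lines. In this toy Python illustration, `run_on_qpu` is a hypothetical stand-in for submitting a sub-problem to one quantum machine; the sub-problems run in parallel and only the final aggregation is classical.

```python
from concurrent.futures import ThreadPoolExecutor

def run_on_qpu(subproblem):
    # Hypothetical stand-in: pretend each sub-problem returns an
    # expectation value from one quantum machine.
    return sum(subproblem) / len(subproblem)

# A workload partitioned into independent sub-problems, one per cryostat.
subproblems = [[0.1, 0.3], [0.2, 0.4], [0.5, 0.7]]

with ThreadPoolExecutor() as pool:
    partials = list(pool.map(run_on_qpu, subproblems))

# Classical recombination step; in real circuit-cutting schemes this
# is where the exponential classical cost mentioned above appears.
result = sum(partials) / len(partials)
print(round(result, 2))  # 0.37
```

The sketch only shows the control flow; the hard part in practice is choosing a partition whose classical stitching cost stays manageable.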
The next step is to combine the field of quantum communications and quantum computing. It’s a roadmap over the next 10-20 years, but we will see quantum supercomputers and they are going to work in concert with the current supercomputers.
I would now like to segue into how the adoption of hybrid cloud has increased in enterprises, and how it has evolved from both a market and a research point of view.
From a market lens, if you look at any kind of medium-size or large-scale business, this reality (of a hybrid cloud) is already there. Simply put, the question is how to make the hybrid cloud strategy work and continue to modernize the infrastructure so that workloads and processes run optimally across it. That explains why the open-source component and the acquisition of Red Hat were so crucial: to have an operating system based on Linux and a container architecture based on Kubernetes. This is a $1 trillion-plus annual market opportunity for us to provide the middleware, the infrastructure, and the right skills through IBM Consulting to help our clients operate and succeed in that environment.
From a computer science lens, we have seen the huge importance of edge computing and, if you look beyond it, the increasingly heterogeneous nature of computing: microprocessor-centric architectures alongside AI accelerator-centric architectures and, in the future, quantum-centric architectures. So, it’s critical to build a very heterogeneous, very distributed computational environment and ensure it is architected properly and works.
Speaking about AI, even as big data is important, there is much effort to do a lot more with less data.
Yeah, it’s true. One extreme continues to be a story of how you learn from large amounts of data; we’re talking about taking advantage of advances in self-supervision to train large foundation models, and a good example is natural language processing (NLP). But the challenge our clients have had with AI is that the data science portion of it, the data labeling and training pipeline, consumes 90% of the resources and a lot of time. So, anything we can do to reduce this is hugely important. Then there’s another vector: how do you inherently learn with much fewer examples, with few-shot learning, and so on? This is an area where we invest a lot.
Semiconductors are another critical part of IBM Research. In May 2021, IBM announced that its second-generation nanosheet technology had paved a path to the 2nm node. Please explain the significance of this development.
The topic of semiconductors has become a national and international priority. I meet with government leaders around the world, and now politicians and citizens are realizing the importance of semiconductors because they are literally in everything: cars, refrigerators, phones, and computers. The semiconductor industry is a half-a-trillion-dollar industry, and by all accounts it is going to double in size in the coming decade. To enable that growth, innovation and manufacturing capability must go hand in hand.
IBM plays a central role on the innovation side in creating the new technology that enables manufacturers to bring that capability to the world at scale. As an example, last year’s announcement of the 2-nanometer technology is incredibly exciting because there’s almost nothing more impactful than a next-generation transistor (it allows a chip to fit up to 50 billion transistors in a space the size of a fingernail). We also recently (in December 2021) announced the Vertical Transport Field Effect Transistor (VTFET), a design aimed at enabling smaller, more powerful, and energy-efficient devices. Of course, we use the expertise we have in semiconductor technology to build quantum computers as well.
What’s your role as a member of the National Science Board?
The National Science Board is the governing board of the National Science Foundation (NSF) of the United States. The NSF funds a very significant part of all basic science work. The characteristic of that funding is that it is curiosity-driven, not driven by application. It’s about advancing the frontiers of mathematics, physics, chemistry, and biology, and it is hugely important that we, as societies, defend and support the need for that kind of discovery. Without that investment, such discovery would take many decades longer.
Before we wrap up, I would love to have your thoughts on Web 3.0 and metaverse—the two buzzwords that are currently taking the industry by storm.
I like to look at the foundations of these areas. On Web 3.0, it’s back to the computer science story. It’s about how we build the next generation of truly distributed computational environments. We touched on this from the perspective of hybrid cloud, but this is complementary to that, because it is about how you build a web architecture and a network architecture that is inherently distributed by design. This requires thinking about a lot of foundational things, right from the security dimension to the semantic nature of the relationships around it. The previous version (of the web) was all about interactivity. Now it’s also about how we bring together sensors, the fact that we have computers everywhere, and humans interacting with them. So, it’s this next-generation architecture that I think is fundamental.
As for the metaverse, maybe I’m not the most qualified person to talk about it, but obviously it’s going to be a hugely important way to extend how we entertain and collaborate. But I would really like that technology to also be oriented toward solving a broader set of problems.