Q2B25 Silicon Valley, being held this week in Santa Clara, California, is billed as “The Roadmap to Quantum Value,” focusing on practical quantum technologies. This is the eighth consecutive year for the conference.
Quantum computing is not typically an area that I cover, although I did write one previous article, Leaps in Quantum Computing, about two years ago that focused more on the computational complexity implications and BQP.
There’s something exciting about attending an event that’s filled with a lot of hope for the future of quantum computing technology, and which has many small players and startups as well as large corporations exploring several different avenues of research.
One panel session focused on innovation and investment in quantum technologies. The panel was moderated by Paul Olcott, principal at First Spark Ventures, and featured panelists Scott Buchholz, managing director at Deloitte Consulting; Jamie Garcia, director of strategic growth and quantum partnerships at IBM Quantum; and Raul Camposano, managing partner at Silicon Catalyst. The picture below shows the panelists along with Rich Curtin, managing partner at Silicon Catalyst, who delivered a short introduction before the discussion got started.

Panelists (back row, left to right): Scott Buchholz, Jamie Garcia, Raul Camposano; moderator: Paul Olcott; (front) Rich Curtin
The panel focused on the fact that while new quantum technologies are advancing rapidly, their true value will only be realized through scalable, real-world applications. The discussion covered the interplay between hardware, software, and hybrid quantum–classical systems, as well as the role of public–private partnerships, and gave attendees strategic insights into how quantum technologies are moving from proof of concept to practical deployment, and what it will take to unlock their projected multi-trillion-dollar economic impact over the next decade.
First Spark is a deep tech venture capital fund investing in breakthrough technologies. Paul said that he specializes in investments in AI, biotech, and quantum technologies.
Each panelist gave a brief bio as an introduction.
Jamie Garcia has a Ph.D. in chemistry and started out doing benchwork. She has a background in organic chemistry and polymers and applied that expertise at IBM to develop photoresists. As it turns out, some of the same materials are used for making qubits. Attending a talk on ground-state energies at Yorktown Heights was an “aha” moment for Jamie about how she could connect her own chemistry work to quantum computing. It was no longer just sci-fi to her, and she took a leap of faith and decided to become a part of it. So when she got an opportunity at IBM as an operations manager on a computational chemistry and physics team, she took it.
Scott Buchholz’s father was an experimental high-energy physicist, and his mother was a biochemist. Scott said that he sat through a lot of technical discussions at dinner growing up. My sons would probably empathize with Scott, since their mother also has a doctorate, in materials science. Several years ago, Scott started seeing clients who were going into the quantum computing space. He talked with the company’s CTO, who agreed that Deloitte should get into the area and came back a couple of weeks later with a green light to go ahead.
Raul Camposano started by saying that he’s not sure he has totally taken the leap of faith yet on quantum computing. As the former CTO of Synopsys, he focused on how to make things run faster and tackled NP-hard problems, and he said he is fascinated by the potential to exponentially accelerate computations. Raul read Paul Dirac’s book, The Principles of Quantum Mechanics, and said its description of state spaces in terms of superposition made it the most useful book he has read in the field. He joined Silicon Catalyst 10 years ago. Silicon Catalyst is an incubator/accelerator that specializes in semiconductor companies and partners with other companies that contribute in-kind with tools and foundry shuttles. It hadn’t evaluated any quantum computing companies until recently, when 16 of them applied; none have been accepted into the program yet. Silicon Catalyst has hundreds of advisors on board, and about 25 of them have quantum computing backgrounds and expertise.
Paul commented that it’s a common story that it’s hard to get investment, but he encouraged people to keep trying. He asked Jamie how things have changed since she started working on quantum computing.
Jamie said that the hardware is progressing at a rate that couldn’t have been predicted. Technologies are set to appear in 2029 that many weren’t originally expecting to see in their lifetimes. The team wants to see quantum computers actually being used for practical problems. In 2016, IBM put a 5-qubit device on the cloud for anyone to use. Since then, IBM has deployed dozens of quantum computers around the world in places like Japan, Germany, Spain, and the US. IBM chose superconducting qubits based on speed and the ability to build them, not necessarily with an eye toward solving chemistry problems. IBM’s next-generation Nighthawk is built for fault tolerance and degree-6 connectivity; the couplers for degree-6 connectivity are targeted for the end of this year. Nighthawk is 120 qubits, and the next scale-up, to 200 qubits, is targeted to be available on the cloud (in a Poughkeepsie data center) by 2029. The goals are to get bigger and bigger and to push error rates down.
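For readers who haven’t tried it, the cloud access Jamie described boils down to a few lines of Python. Below is a minimal sketch using Qiskit and the Qiskit Runtime service, assuming you have qiskit and qiskit-ibm-runtime installed and an IBM Quantum account saved locally; the exact API can drift between versions, and the backend chosen is simply whatever device has the shortest queue, not anything specific to Nighthawk.

```python
# Minimal sketch: run a tiny circuit on an IBM Quantum device via Qiskit Runtime.
# Assumes a saved IBM Quantum account; API details may differ across versions.
from qiskit import QuantumCircuit, transpile
from qiskit_ibm_runtime import QiskitRuntimeService, SamplerV2 as Sampler

qc = QuantumCircuit(2)
qc.h(0)          # put qubit 0 into superposition
qc.cx(0, 1)      # entangle qubits 0 and 1
qc.measure_all()

service = QiskitRuntimeService()                       # uses the saved account
backend = service.least_busy(operational=True, simulator=False)
isa_circuit = transpile(qc, backend=backend)           # map to the device's gates and couplers
job = Sampler(mode=backend).run([isa_circuit])
print(job.result()[0].data.meas.get_counts())
```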
Paul asked Raul to compare the state of quantum computing to his early experience with logic synthesis when it was introduced to the industry.
Raul said that hardware is one thing, but software is another. One of the plenary speakers earlier in the day said to first identify which types of problems can be solved with quantum computing and then map those problems. All interesting problems are NP-hard, and current tools use heuristics built on years of experience to address them. Programming quantum computers is more like designing complex circuits. In the US there are approximately 80,000 hardware designers and about 5 million software engineers; Raul asked how many of them would write quantum computing software. Tools to enable quantum programming will be key, and it’s still an evolving field.
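To make the heuristics point concrete, here is a generic sketch (my own illustration, not a tool anyone on the panel described) of a greedy local-search heuristic for Max-Cut, one of the NP-hard optimization problems often mentioned as a quantum target; the small graph is made up.

```python
import random

# Greedy local search for Max-Cut: a classical heuristic for an NP-hard problem.
# The graph below is an arbitrary example used only for illustration.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4
side = [random.randint(0, 1) for _ in range(n)]   # random initial partition

def cut_size(side):
    return sum(1 for u, v in edges if side[u] != side[v])

improved = True
while improved:
    improved = False
    for v in range(n):
        before = cut_size(side)
        side[v] ^= 1              # try moving vertex v to the other side
        if cut_size(side) > before:
            improved = True       # keep the move if it increases the cut
        else:
            side[v] ^= 1          # otherwise undo it

print("cut size:", cut_size(side), "partition:", side)
```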
Paul asked Scott how best to communicate breakthrough advances and get companies ready.
Scott said that one important factor is that quantum computing is not just a better version of today’s computing. Many view quantum computing as some kind of new magic pixie dust that can be sprinkled on their problems to make them easier to solve. Scott tells people that it takes about two years to become competent, roughly the same amount of time it takes someone in data science to reach competence. Roadmaps indicate that commercially relevant use cases will appear in the next one to three years. Most of Deloitte’s customers are Global 500 companies interested in solving today’s problems, and they need value today. Having customers ramp up on ML solutions that work on today’s classical hardware can also be a pathway to getting up to speed on quantum computing.
Paul mentioned that businesses tend to act on fear or greed. Is Q-Day, or some optimization problem, the driver? What kinds of strategies are used when pitching here?
Scott said that you can view this as purchasing a call option if you think quantum computing might take off, or as an insurance policy if you don’t think it’s going anywhere but don’t want competitors to get too far ahead in case it does. The other choices are to wait or to just ignore it. On cybersecurity, fear perhaps isn’t the driver so much as the need to get involved. Some think that cloud providers will take care of it, but cloud providers don’t always view it as only their problem. Quantum-secure techniques are relevant now, so that any security solutions currently under development have a lifetime that extends into the future.
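A common rule of thumb for this urgency, often attributed to Michele Mosca and not something the panel walked through explicitly, compares how long data must stay secret plus how long migration takes against how soon a cryptographically relevant quantum computer might arrive. The quick arithmetic below uses placeholder numbers purely for illustration.

```python
# Illustrative arithmetic only; the year estimates are placeholders, not panel figures.
shelf_life_years = 10   # how long encrypted data must remain confidential
migration_years = 5     # how long it takes to roll out quantum-safe cryptography
years_to_crqc = 12      # a guess at when a cryptographically relevant quantum computer arrives

if shelf_life_years + migration_years > years_to_crqc:
    print("Data encrypted today could eventually be exposed: start migrating now.")
else:
    print("There is some slack, but all three estimates carry large error bars.")
```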
Paul asked Raul if he had any suggestions for innovators who want to get into this space now.
Raul said that it’s hard to believe a startup can handle the whole stack, so small companies should focus on one problem that can be integrated into a total solution. The Danish 55North is putting in money, and the University of Munich and Silicon Catalyst are providing incubator opportunities to help startups. Staying connected and staying focused is key.
Paul asked Jamie for a breakdown of all the different possible approaches: photons, ions, superconducting qubits, and so on.
Jamie said that fault tolerance is important for actualizing many of the algorithms. Applications like cryptography depend on fault tolerance and on a certain number of qubits. Over the last year to year and a half, the focus has shifted to low-density parity-check codes (qLDPC codes, as opposed to surface codes), the code parameters, how many qubits are needed, and how to lay out the qubits to get to a solution. The idea is that between what you run now and what you run in 2029, the API will stay the same, but the 2029 quantum computer will have more qubits, more gates, and support for larger scales. That’s also why fault tolerance is important (which is also true outside of quantum as systems get larger); it’s just much earlier days in quantum computing. Qubits are sensitive, which is part of what makes them useful, but it means that temperature, radiation, and other noise sources can have an impact. Even so, today we can already see quantum computing utility, including in situations where the machine just gives an answer that can provide insight into your problem. It’s all about the applications and circuits, and AI and AI-HPC are important as we march toward quantum advantage. Quantum computing will be coupled with HPC.
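As a much simpler illustration of the error-correction idea (a three-qubit bit-flip repetition code, not the qLDPC or surface codes Jamie was referring to), the sketch below encodes one logical qubit into three physical qubits, injects a single bit-flip error, and recovers the logical state. It assumes Qiskit is installed.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, partial_trace

# Three-qubit bit-flip repetition code: a toy picture of error correction.
qc = QuantumCircuit(3)
qc.h(0)              # prepare the logical qubit in a superposition
qc.cx(0, 1)          # encode: spread the basis information onto qubits 1 and 2
qc.cx(0, 2)
qc.x(1)              # inject a single bit-flip error on physical qubit 1
qc.cx(0, 1)          # decode: compute the error syndrome onto qubits 1 and 2
qc.cx(0, 2)
qc.ccx(1, 2, 0)      # correct qubit 0 only if both syndrome bits fire

state = Statevector(qc)
print(partial_trace(state, [1, 2]))   # reduced state of qubit 0: the |+> state survives
```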
Paul asked if fault tolerance can be made visible to the user.
Jamie said that there’s interesting research on how to balance work between the QPU, CPU, and GPU. IBM is partnering with AMD on quantum computer integration. It’s not yet at scale, but it’s an interesting new avenue to explore in this heterogeneous architecture.
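One way to picture that balance is the variational pattern, where a classical optimizer repeatedly adjusts the parameters of a circuit whose expectation values a quantum processor would estimate. The sketch below is a toy version that simulates the circuit on the CPU with Qiskit’s Statevector (no real QPU, and nothing to do with the IBM–AMD work specifically); the Hamiltonian and ansatz are arbitrary illustrations.

```python
import numpy as np
from scipy.optimize import minimize
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, SparsePauliOp

# Toy hybrid loop: a classical optimizer (CPU) tunes circuit parameters whose
# energies a QPU would normally estimate; here the circuit is simulated instead.
hamiltonian = SparsePauliOp.from_list([("ZZ", 1.0), ("XI", 0.5)])  # arbitrary example

def energy(theta):
    qc = QuantumCircuit(2)
    qc.ry(theta[0], 0)
    qc.ry(theta[1], 1)
    qc.cx(0, 1)
    return Statevector(qc).expectation_value(hamiltonian).real

result = minimize(energy, x0=np.zeros(2), method="COBYLA")
print("estimated minimum energy:", result.fun)
```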
An audience member asked Jamie whether using IBM Quantum for quantum finance software is useful. He noted that Nighthawk saw a large jump in available circuit depth and asked how users can know what to expect going forward.
Jamie said that Heron (degree 4) is going to degree 6 for Nighthawk, and IBM is working on the couplers. On the innovation roadmap, users will see increases in the number of qubits and the number of gates that can be run while IBM continues to push errors down. As new capabilities are developed, they will be integrated into the system. Users will be able to run more gates and more types of gates, and in 2029 there will be fault tolerance, all “under the hood” from the user’s viewpoint.
Rich Curtin asked the final question to the panel to wrap up the session: hardware or software, where would you spend your time?
Scott answered that as a long-time software guy the answer is apparent. You can work backwards or forwards to get there.
Jamie said that her CEO says 90% of the value in quantum computing is going to come from the applications. Algorithm development is important; the recent reduction in the resources needed to run Shor’s algorithm through some clever reorganization helps illustrate that case. Hybrid architectures, software, applications, and algorithms will all play important roles.
Paul said that addressing hardware scaling losses is important, and he sees a possible bottleneck five to six years out in getting to circuit depths in the millions. Their bet is on the hardware side.
Raul stated that he is on the software side. He believes there will be huge progress in emulating quantum computing; quantum computing simulations are already available in 5,000 lines of Java to help with enablement, so definitely software.
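To give a feel for what Raul means, a usable toy statevector simulator really does fit in a small amount of code. The Python sketch below (my own illustration, not the Java code Raul referenced) applies a Hadamard and a CNOT to two qubits and prints the resulting Bell-state amplitudes.

```python
import numpy as np

# A toy statevector simulator: single-qubit gates plus CNOT on a few qubits.

def apply_1q(state, gate, target, n):
    """Apply a 2x2 gate to the target qubit of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.moveaxis(state, target, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply CNOT by flipping the target axis of the slice where the control is 1."""
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1
    axis = target if target < control else target - 1   # the control axis is removed by the slice
    state[tuple(idx)] = np.flip(state[tuple(idx)], axis=axis)
    return state.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

n = 2
state = np.zeros(2 ** n)
state[0] = 1.0                               # start in |00>
state = apply_1q(state, H, 0, n)             # Hadamard on qubit 0
state = apply_cnot(state, 0, 1, n)           # entangle the two qubits
print(np.round(state, 3))                    # amplitudes of |00>, |01>, |10>, |11>
```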