Monday, February 2, 2026

Physics Works, Engineering Makes It Work

My post Saturday on Making Quantum Superconducting Qubits received an interesting (and excellent) comment from a reader on LinkedIn.

Summarizing, the commenter argued quantum computing metrics (coherence times, yields, precision) are measured on idle qubits, not under real computational loads with circuits, crosstalk, and error correction. Once you add gates and routing, stability requirements exceed current improvements by orders of magnitude. And, after decades, there's still no error-corrected logical qubit running useful work. 

My comment back: "Agree. The physics works. Scaling remains unsolved." This got me thinking this morning. STEM students often ask about the differences between fields. One of the most common questions: "What separates physics from engineering?" Let's try to answer that using quantum as an example.

Physics discovers principles. Engineering builds systems that exploit those principles at scale. The gap between the two defines most hard technology problems.

Take quantum computing. Physicists proved you can trap ions, manipulate superconducting circuits, or use topological states to create qubits. The math works. Lab demonstrations show quantum advantage for specific problems. Physics is satisfied.

Engineering asks different questions. How do you manufacture 1,000 identical qubits when each one requires nanometer precision? How do you cool them to 15 millikelvin and hold that temperature while running computations? How do you shield them from electromagnetic interference in a data center? How do you get signals in and out without destroying coherence? How do you do all this reliably, repeatedly, and affordably?

Physicists build one qubit that works beautifully under perfect conditions. Engineers must build systems where hundreds of qubits work together under real conditions. Every quantum computing company today is struggling to bridge this gap.

The physicist optimizes for understanding. The engineer optimizes for constraints: cost, yield, thermal management, signal integrity, maintenance, supply chains. A physics experiment might use custom components that cost $500,000 and require manual calibration. An engineering solution needs off-the-shelf parts and automated processes.

You see this everywhere in technology. Physicists demonstrated photovoltaic effects in 1839. Engineers spent 150 years making solar panels cheap enough to matter. Physicists proved nuclear fusion in the 1930s. Engineers still cannot build a reactor that produces more energy than it consumes at useful scale.

The difference is not just scale though. It is thinking about failure modes, manufacturing tolerances, quality control, serviceability, and integration with existing infrastructure. Physics assumes ideal conditions. Engineering assumes Murphy's Law.

This creates tension. Physicists get frustrated when engineers say "that will never work in production." Engineers get frustrated when physicists dismiss practical constraints as details. Both are wrong. You need physics to know what is possible. You need engineering to make it real.

Quantum computing sits in this gap right now. The physics is spectacular. The engineering is brutal. Whoever solves the engineering problem first wins the market. That is always how it works. 

Saturday, January 31, 2026

Making Quantum Superconducting Qubits

I’ve written about qubits in the past - the basic unit of quantum information that can exist in a superposition of both 0 and 1 states simultaneously until measured, unlike a classical bit which is always either 0 or 1. Let’s take a closer look at how a qubit can be made - there are three common approaches:

·      Superconducting Qubits

·      Trapped Ion Qubits

·      Photonic Qubits

Superconducting qubits are what IBM and Google use, and they're what I'll cover in this post. They're tiny electrical circuits that only work at temperatures colder than deep space.


What Makes Them Quantum

At room temperature, these are just pieces of metal. Cool them to 15 millikelvin and they become superconductors. Electricity flows without resistance. Electrons move in perfect sync, acting like one quantum wave instead of individual particles.

The circuit has specific energy levels, like rungs on a ladder. Ground state is 0, first excited state is 1. The quantum effect lets the circuit be in both states at once until you measure it.

The Josephson Junction

The heart of a superconducting qubit is the Josephson junction: two pieces of aluminum separated by an insulating barrier 1 to 2 nanometers thick. That's roughly 10 atomic layers.

At cryogenic temperatures, electrons quantum tunnel through that barrier even though classical physics says they can't. This creates a nonlinear inductance. Combined with a capacitor, you get unequally spaced energy levels.

Why does that matter? A regular circuit has evenly spaced levels. You can't use it as a qubit because when you try to flip between 0 and 1, you accidentally excite higher levels too. The Josephson junction creates anharmonicity. The gap between 0 and 1 differs from the gap between 1 and 2. This lets you address just the first two levels with microwave pulses.
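To make that concrete, here is a minimal sketch using the standard transmon approximation, where the transition frequencies follow from the Josephson energy E_J and the charging energy E_C. The values below are illustrative orders of magnitude, not numbers from any particular device.

```python
import math

# Standard transmon approximation (valid when E_J >> E_C):
#   f_01 ≈ sqrt(8 * E_J * E_C) - E_C       (0 -> 1 transition)
#   f_12 ≈ sqrt(8 * E_J * E_C) - 2 * E_C   (1 -> 2 transition)
# so the anharmonicity f_12 - f_01 is roughly -E_C.
# Illustrative values in GHz, not from a real device.
E_J = 12.5   # Josephson energy / h
E_C = 0.25   # charging energy / h

f_01 = math.sqrt(8 * E_J * E_C) - E_C
f_12 = math.sqrt(8 * E_J * E_C) - 2 * E_C

print(f"0->1 transition: {f_01:.2f} GHz")                    # ~4.75 GHz
print(f"1->2 transition: {f_12:.2f} GHz")                    # ~4.50 GHz
print(f"anharmonicity:   {(f_12 - f_01) * 1000:.0f} MHz")    # ~ -250 MHz
```

That few-hundred-megahertz gap is what lets a microwave pulse tuned to the 0-to-1 frequency flip the qubit without also climbing to the second excited state.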


The Fabrication Process

Start with silicon: High purity silicon wafer, extremely flat and clean. Any contamination creates
defects.

Deposit aluminum: Vacuum chamber, molecular beam epitaxy. Deposit 100 to 200 nanometers of aluminum in ultra-high vacuum to prevent oxidation.

Pattern the circuit: Photolithography for the basic shapes. Electron beam lithography for the junction because you need nanometer precision. An electron beam writes the pattern point by point.

Create the junction: The Dolan bridge technique works well. Evaporate aluminum at an angle, deposit the oxide barrier, evaporate more aluminum at a different angle. The two layers overlap slightly with oxide between them. The overlap area is your junction.

Getting the oxide thickness right is critical. Too thick and electrons can't tunnel. Too thin and you get leakage. You're aiming for 1 to 2 nanometers with sub-nanometer precision.

Add control circuitry: Microwave transmission lines, coupling capacitors, and resonators. The readout resonator is a microwave cavity whose frequency shifts depending on qubit state. Send in a microwave pulse, measure the reflected signal. The tiny frequency shift tells you if the qubit is 0 or 1.
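To get a feel for how small that readout shift is, here is a rough sketch using the standard dispersive approximation for a transmon coupled to a readout resonator. The coupling, detuning, and anharmonicity below are plausible textbook-scale numbers picked for illustration, not measurements from a real chip.

```python
# Dispersive shift of the readout resonator (standard circuit-QED approximation):
#   chi ≈ g^2 * alpha / (Delta * (Delta + alpha))
# g: qubit-resonator coupling, Delta: qubit-resonator detuning,
# alpha: (negative) transmon anharmonicity. All values in GHz, illustrative only.
g = 0.10
Delta = 1.50
alpha = -0.25

chi = g**2 * alpha / (Delta * (Delta + alpha))
print(f"dispersive shift: {chi * 1000:.2f} MHz")                        # ~ -1.3 MHz
print(f"resonator pull between |0> and |1>: {2 * abs(chi) * 1000:.2f} MHz")
```

A pull of a couple of megahertz on a resonator sitting in the several-gigahertz range is the tiny shift the readout chain has to resolve through all that cabling and amplification.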


Why It's Hard

Junction uniformity: A 5% variation in junction area changes qubit frequency by hundreds of megahertz. Hitting the right target across a whole chip is brutal.
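A quick back-of-the-envelope check on that claim, assuming the Josephson energy scales linearly with junction area and the qubit frequency follows the same transmon approximation sketched earlier:

```python
import math

# If E_J is proportional to junction area, f_01 ≈ sqrt(8*E_J*E_C) - E_C,
# so a small fractional area error shifts the frequency by roughly half
# that fraction. Illustrative values only.
E_J = 12.5          # GHz
E_C = 0.25          # GHz
area_error = 0.05   # 5% variation in junction area

f_nominal = math.sqrt(8 * E_J * E_C) - E_C
f_shifted = math.sqrt(8 * E_J * (1 + area_error) * E_C) - E_C

print(f"nominal f_01:    {f_nominal * 1000:.0f} MHz")   # ~4750 MHz
print(f"with +5% area:   {f_shifted * 1000:.0f} MHz")   # ~4874 MHz
print(f"frequency shift: {(f_shifted - f_nominal) * 1000:.0f} MHz")
```

That is over a hundred megahertz from the area miss alone; oxide-thickness variation hits E_J even harder, which is how real devices drift by hundreds of megahertz from their design targets.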

Material defects: Impurities create two-level systems that absorb energy and cause decoherence. You need ultra-pure materials and ultra-clean fabrication. Cosmic rays passing through can disrupt qubits.

Yield: When IBM makes a chip with 50 qubits, maybe 30 to 40 work well. The rest have junction defects, frequency problems, or excessive noise. No way to repair a bad qubit.

Coherence times: Superconducting qubits lose their quantum state in 100 to 500 microseconds. Some designs reach milliseconds but that took years of material science improvements.


Different Designs

Transmon: Most common. Large capacitor reduces sensitivity to charge noise. Coherence around 100 microseconds. Relatively easy to make.

Flux qubit: Superconducting loop with Josephson junctions. Sensitive to magnetic flux. Harder to isolate from noise.

Fluxonium: Long chain of Josephson junctions as a superinductor. Can hit 1 millisecond coherence but harder to fabricate and control.


The Support System

Dilution refrigerator: Pumps helium-3 and helium-4 to reach millikelvin temperatures. Takes 24 hours to cool down. Costs approx. $2 million.

Microwave control: Room temperature electronics generate 4 to 8 gigahertz pulses. Signals travel down coaxial cables into the fridge. A 50 qubit chip needs 100+ cables.

Magnetic shielding: Mu-metal around the refrigerator, sometimes superconducting shields at the coldest stage. Stray fields from power lines or passing cars disrupt qubits.

Signal processing: Low-noise amplifiers, high electron mobility transistor amplifiers, fast analog-to-digital converters. Software extracts qubit states from noisy signals.


Current Performance

Best superconducting qubits today:

  • 100 to 500 microsecond coherence
  • 20 nanosecond gate operations
  • 99%+ two-qubit gate fidelity
  • 99.9%+ single-qubit gate fidelity

You can do roughly 1,000 to 10,000 operations before decoherence. Not enough for most useful algorithms yet, which need millions of operations. That's why quantum error correction is critical.
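That operation budget is roughly the coherence time divided by the gate time. A quick sketch with the figures above:

```python
# Rough gate budget: how many gate times fit inside the coherence time.
# Uses the numbers quoted above; real budgets are lower once two-qubit
# gates (slower than 20 ns), readout, and error rates are included.
coherence_times_us = [100, 500]   # microseconds
gate_time_ns = 20                 # nanoseconds, single-qubit gate

for t in coherence_times_us:
    budget = (t * 1000) / gate_time_ns
    print(f"{t} us coherence -> ~{budget:,.0f} gate times")
# 100 us -> ~5,000 and 500 us -> ~25,000, before the overheads that pull
# the usable count back toward the 1,000 to 10,000 range.
```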


Scaling Challenges

Wiring: Can't run a million coax cables into a fridge. Need multiplexing and cryogenic control electronics inside the refrigerator.

Crosstalk: Packed qubits interfere with each other. Control signals leak to neighbors.

Uniformity: Making 1,000 nearly identical qubits pushes fabrication limits.

Materials: Better materials with fewer defects would improve coherence directly.


Five years ago, 50 microsecond coherence was state of the art. Now it's 500 microseconds. Ten years ago, chips had 5 qubits. Now they have hundreds.

The physics works. Scaling remains unsolved. I'll describe trapped ion and photonic qubits here in future posts.

Saturday, January 24, 2026

From Telecom to Quantum: Back In Springfield Where I Started

Who says you can't go back? Who says you can't go home?

- Richie Sambora, Jon Bon Jovi, John Shanks

The Economic Development Administration has designated Western Massachusetts as a Tech Hub and awarded $1 million to launch the Quantum Supply Chain Accelerator (QSCA). This designation places Springfield alongside 31 other regions nationwide recognized for potential in emerging technologies. The funding supports initial development of infrastructure, partnerships, and workforce programs needed to establish the region as a quantum technology manufacturing center. Western Massachusetts joins clusters focused on semiconductors, biotechnology, and advanced materials, but stands alone in targeting quantum supply chain coordination. The award positions Springfield at the center of quantum technology infrastructure development, building on existing advanced manufacturing capabilities and educational institutions in the Pioneer Valley.

The QSCA will tackle a critical bottleneck: sourcing components for quantum computers, sensors, and networks. Right now, no coordinated supply chain exists. Companies building quantum systems struggle to find reliable suppliers for specialized parts. QSCA aims to fix that by connecting manufacturers, standardizing components, and building regional capacity.

The project involves industry partners, research institutions, and educational programs. It will create jobs in advanced manufacturing and position Massachusetts as a hub for quantum technology production, not just research.

I'm consulting on this with the Mass Tech Collaborative and Springfield Technical Community College (STCC). This takes me back to my younger days at the NSF-funded National Center of Excellence – the National Center for Telecommunications Technologies at STCC from 1998 to 2014. Twenty-eight years later, I'm back working on emerging tech in Western Massachusetts and that is soooo cool. What goes around comes around.

That earlier work focused on telecommunications and emerging technologies with partners like Cisco, Microsoft and Verizon. This feels similar: building infrastructure and workforce pipelines for emerging market technologies.

The quantum industry needs workforce across the spectrum: two-year degree technicians handling equipment operation and maintenance, engineers designing systems, and PhDs advancing the science. Community colleges train the technicians. Universities provide the engineers and researchers. All are critical. You can't scale and advance quantum technology without all three.

That $1 million is seed funding. Success depends on building partnerships, developing training programs, and proving the concept works. It's early stage work with real potential impact.

Let’s Go! 

Monday, January 19, 2026

The Mathematics of an Extraordinary Life

Dad on the roof, October 2024. He had dropped the flagpole onto the roof so he could fix a pulley.

A few weeks ago I wrote about my dad passing away the day after Christmas at 94 years and 9 months. My mom, at 93 years and 6 months, is adjusting to life without him after 72 years of marriage. Over the past couple of years, she has been dealing with dementia, with my father as her primary caregiver. Both lived independently in their own house until three weeks before his death. Until those final weeks, my father was still mowing an acre of grass weekly, snowblowing a 75-foot driveway, climbing on the roof to fix siding and shingles, and caring for my mother full-time.

When processing just about anything, eventually I look at numbers. Quantifying what my father accomplished gives me a way to understand the magnitude of what we lost. The statistics don't diminish the grief; they help frame it. They tell me that what felt extraordinary to us as a family was, mathematically, exactly that extraordinary. The numbers confirm what I already knew but couldn't articulate: my parents were a statistical anomaly, and us kids have been fortunate beyond measure. 

According to Social Security Administration actuarial tables, only 10% of males born in 1931 survive to age 94. Only 20% of females born in 1932 reach 93. The probability of both reaching these ages is roughly 2%. A 72-year marriage occurs in 0.1% of marriages.

But the physical capability is where the statistics become really remarkable.

In gerontology, my parents would classify as "super-agers." Most people see sharp decline in functional capacity starting in their 70s. Fewer than 1% of 94-year-olds perform high-intensity tasks like roof work or operate heavy machinery while serving as a full-time caregiver.

My parents operated on what geriatricians call a "squared survival curve," maintaining a high plateau of health and performing at the physical level of people 20 to 25 years younger.

The combined odds: married 72 years, both alive at these ages, both living independently, my father performing high-risk maintenance and caregiving. 

Approximately 1 in 66 million couples.

They avoided the big three killers: cardiovascular disease, cancer, and neurodegeneration. They avoided the fall cycle that typically ends independence after age 80. They avoided the widowhood effect, where survival odds drop significantly when one spouse dies.

The Gompertz-Makeham Law of Mortality states that after early adulthood the risk of death doubles roughly every 8 years. Forty-four years separate age 50 from age 94, about five and a half doublings, so my father's annual mortality risk should have been roughly 45 times that of a 50-year-old. His physical ability suggests his biological age was closer to 74, a 20-year gap at the far edge of what has been observed in human biology.

Genetics played a role. Continuous physical loading maintained bone density and cardiovascular health that most people lose through retirement. Managing a household kept cognitive reserve high. Being married for 72 years provided massive protection against stress and isolation.

My mother now faces life without him, navigating both grief and the progression of her dementia. But she faces it with the same extraordinary resilience that carried both of them well past the boundaries that limit most lives. And she faces it having been cared for by someone who never stopped being her partner, even when the work became harder than climbing any roof.

The numbers tell a story of statistical improbability. But behind those numbers lived a couple that never stopped moving, never stopped working, and never stopped being there. The mathematics of their lives describe a couple of outliers. My experience describes an amazing married couple.

Saturday, December 27, 2025

Lessons From My Father's Life

My father died yesterday. He was 94.

He worked 33 years as a telephone company lineman and repair person. He was retired 41.


He taught me the world doesn't owe you anything. But it gives you everything if you work for it.


What I learned from him:

·      Success isn't what you accumulate. It's who you help along the way.

·      Freedom isn't avoiding work. It's loving what you do so much you'd do it without pay.

·      You don't need all the answers. You need better questions.

·      The goal isn't another achievement, another title, another milestone.

·      The goal is a legacy. Something that outlasts you. Something that matters beyond yourself.


He spent much of his life teaching me in fields, streams, woods, forests, offshore in rough water, and in quiet moments watching the years pass.


I'm 68. Maybe I’ve got 26 more years of lessons if I'm lucky. 


Thanks Dad. Miss you.

Monday, December 15, 2025

Slow Connections, Fast Results: The Future of Distributed Quantum Computing

Researchers at IonQ and Aalto University have proved that multiple quantum processing units (QPUs) connected through slow interconnects can outperform single large quantum computers. This matters because building connections between quantum computers is much harder than making the computers themselves faster.

A qubit is the basic unit of quantum computing, similar to how a bit is the basic unit of regular computing. But while a regular bit is either 0 or 1, a qubit can be both 0 and 1 simultaneously until measured. This property lets quantum computers solve certain problems much faster than conventional computers.

Think of the challenge like trying to solve a puzzle. You could build one giant table and work alone, or you could connect several small tables with people working together. The catch: passing puzzle pieces between tables takes much longer than placing them on your own table.

Current quantum computer links are roughly 100 times slower than operations inside a single machine. Most experts assumed this speed gap made connected systems impractical. The IonQ and Aalto University team proved otherwise.

Their solution uses a clever technique called distributed CliNR (Clifford Noise Reduction). Instead of waiting for slow connections during the main computation, they prepare verified components in parallel on separate machines. Each quantum computer works independently on its piece, then they connect the results only when needed. This reduces both errors and total computation time.

The researchers tested their approach using 85 qubits split across four quantum computers. Even when connections were five times slower than internal operations, the distributed system beat both the direct approach and the single-machine version in speed and accuracy.

The math shows you only need modest connection speeds. For t quantum computers, you need roughly t/ln(t) parallel connections. This grows much slower than the number of machines, making the approach scalable.
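To see how slowly that requirement grows, here is a quick sketch of the t/ln(t) scaling quoted above; the machine counts are arbitrary examples.

```python
import math

# Parallel interconnects needed for t QPUs under the t / ln(t) scaling
# described above. The machine counts below are arbitrary illustrations.
for t in [4, 16, 64, 256, 1024]:
    links = t / math.log(t)
    print(f"{t:5d} QPUs -> ~{links:7.1f} connections ({links / t:.2f} per machine)")
```

The number of connections needed per machine keeps falling as the system grows, which is the sense in which the approach scales.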

Why this matters now: experimental quantum networks already exist but produce entangled pairs every 4-5 milliseconds while internal gates take microseconds. Rather than waiting for faster connections, useful multi-computer systems can now be built.

The work provides blueprints for near-term distributed quantum computers and identifies potential applications, including quantum advantage experiments. These experiments demonstrate that quantum computers can solve specific problems that would take conventional supercomputers impractically long to solve, proving quantum computers have crossed a meaningful performance threshold.

This research shows that slow connections are not a dealbreaker for quantum networking.

Sunday, December 7, 2025

How Prop Betting Multiplies the House Edge

A couple of weeks ago I wrote Part 1 of a two-part series on online gambling titled Sports Betting: How the House Always Wins.

I don’t gamble and never have, but I have friends who do. For them it’s a form of entertainment – no different from the hobbies I spend my own money and time on. In this post I take a look at prop betting.

Prop bets let you wager on specific events within a game rather than the final score. Will Patrick Mahomes throw over 2.5 touchdowns? Will the first play be a run or pass? Will the game go to overtime? You bet yes or no on each proposition.

Books set a line for each prop. Player props dominate the market. Over/under on passing yards, rushing yards, receptions, touchdowns. Team props include first score type, total penalties, time of possession. Game props cover coin toss results, length of national anthem, halftime show events.

The standard vig applies: risk $110 to win $100. But prop betting multiplies the house edge in ways that don't exist in traditional spread betting.

Take a typical player prop: Travis Kelce over/under 64.5 receiving yards. The book offers both sides at -110. Say the book's data shows Kelce has a 52% chance of going over. Fair pricing would charge less than -110 on the over side. Instead, both sides cost -110. The book wins twice: once by treating a 52-48 split like it's 50-50, and again by charging the standard vig on top.

Run the numbers: 100 bettors risk $110 on over, 100 bettors risk $110 on under. Total wagered is $22,000. Kelce finishes with 68 yards. The book pays $21,000 to over bettors. Profit is $1,000. That's the baseline 4.55% edge. But the book already knew Kelce was more likely to go over. The real edge jumps to 6% or 8% because they priced it wrong on purpose.
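Here is that arithmetic as a small sketch. It assumes the book's 52% figure is the true probability and that action splits evenly across both sides, which is the scenario described above.

```python
# Kelce over/under 64.5 receiving yards, both sides at -110 (risk 110 to win 100).
risk, win = 110, 100
bettors_per_side = 100

# Baseline hold with balanced action: winners get their stake back plus 100,
# no matter which side hits.
handle = 2 * bettors_per_side * risk        # $22,000 wagered
payout = bettors_per_side * (risk + win)    # $21,000 paid out
print(f"baseline hold: {(handle - payout) / handle:.2%}")    # 4.55%

# Per-bettor expected value when the true probability of the over is 52%
# but both sides are still priced at -110.
p_over = 0.52
ev_over = p_over * win - (1 - p_over) * risk        # betting the likely side
ev_under = (1 - p_over) * win - p_over * risk       # betting the wrong side
print(f"EV betting the over:  {ev_over / risk:+.2%} of stake")   # about -0.7%
print(f"EV betting the under: {ev_under / risk:+.2%} of stake")  # about -8.4%
```

The sticker vig is 4.55%, but anyone on the wrong side of the shaded line gives up more than 8%, which is roughly how the blended edge climbs toward 6% to 8% once the book knows which side the public tends to take.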

Information gaps widen this advantage. Books track snap counts, matchup data, and injury reports you don't see. They price props based on information you can't access. Fewer bettors can spot bad lines.

Correlation multiplies losses. Mahomes passing yards and Chiefs total points move together. Books let you parlay both and price the ticket as if the legs were unrelated. They're not. Each linked prop increases the edge.
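One way to see why every added leg helps the book, even before any correlation adjustment: the vig compounds across legs. A minimal sketch, assuming each leg is a -110 prop priced as if the legs were independent coin flips:

```python
# Hold on an n-leg parlay where each leg is -110 and legs are priced as
# if independent. The fair payout assumes each leg is a true coin flip.
def parlay_hold(n_legs: int) -> float:
    payout_per_leg = 210 / 110        # payout multiple on a winning -110 leg
    offered = payout_per_leg ** n_legs
    fair = 2 ** n_legs
    return 1 - offered / fair

for n in range(1, 5):
    print(f"{n}-leg parlay hold: {parlay_hold(n):.1%}")
# 1 leg: 4.5%, 2 legs: 8.9%, 3 legs: 13.0%, 4 legs: 17.0%
```

Correlated legs shift the true joint probability on top of this, but the compounding alone shows why books are happy to sell multi-prop tickets.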

Volume drives profit. A single NFL game offers 200+ prop bets versus 10 to 15 traditional bets. More bets mean more vig collected. Books don't balance action on props. They set wide margins and accept the risk because the edge covers losses.

Books win 55% to 60% on props versus 52% to 53% on spreads. You still need 52.38% accuracy to break even. Props make that harder while feeling easier. Books promote props because the math works better for them.