For an industry that invests so heavily in verification, semiconductor teams still find far too many important bugs later than they should.
That is no small problem. Late bug discovery is expensive in every way that matters. It burns engineering time, consumes enormous compute capacity, delays schedules, and erodes confidence when programs are supposed to be converging. In the latest Wilson Research IC/ASIC study, 75% of projects were behind schedule, while verification engineers reported spending 47% of their time on debug. When bugs are found late, debug becomes slower, broader, and harder to predict, which increases schedule pressure just when teams have the least room to absorb it.
Simulation is a big part of the story. Simulation workloads consume an estimated 70% to 80% of design compute capacity. Return on that investment diminishes as runs become increasingly repetitive, yielding less new coverage and fewer new bugs per cycle. The industry is spending a great deal of effort trying to find problems through methods that become less efficient as complexity rises.
That would be easier to accept if there were no practical way to change the economics of verification. But there is. Formal verification is not a niche science project anymore. It is in real use across the industry, and adoption is growing. In the 2020 Wilson Research IC/ASIC study, about 42% of projects reported using formal property checking and about 35% reported using automatic formal applications. The 2024 study shows property-checking adoption growing at about 5.8% annually, with automatic formal applications growing at about 8.7%.
And yet formal is still not used nearly as broadly or as strategically as today’s design complexity would justify. That gap was easier to explain ten years ago. It is getting harder to explain now.
Designs are more concurrent. Architectures are more heterogeneous. Interactions across IP, firmware, and system behavior are increasingly difficult to understand and verify through manual review. Safety, security, and reliability requirements add design and verification complexity that traditional approaches are not always equipped to handle. Under those conditions, it becomes increasingly difficult to argue that long regression cycles and probabilistic confidence alone are enough. The industry has already accepted that complexity is breaking old assumptions in many parts of the flow. Verification should not be the exception.
Part of the problem is cultural. Many organizations still treat formal verification as a specialist technique rather than as an engineering capability. Some teams assume it is too hard to adopt, too narrow to matter, or too dependent on experts to scale. Others associate it with heroic proof efforts that are difficult to converge and even harder to maintain. Those objections are becoming outdated. Formal has become more practical over time, especially through focused formal applications that package model-checking power into narrower, workflow-friendly use cases. Better visualization, better debug, and more task-specific flows are steadily lowering the barrier to entry.
Another problem is how formal gets framed. Too often it is discussed as an alternative to simulation, as though the choice is either formal or regression. That is the wrong comparison. Formal and simulation are complementary techniques. Formal can prove critical properties and expose coverage gaps that simulation might miss, while simulation remains essential for broader dynamic behavior and system scenarios. The real issue is that many teams still rely too heavily on simulation for problems that it is poorly suited to.
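The distinction can be made concrete with a toy sketch. Real formal tools operate on RTL and SVA properties, not Python, and the design, the `step` function, and the property below are all invented for illustration. The point is the shape of the analysis: an exhaustive reachability check either proves a property over every possible input sequence or returns a counterexample trace, whereas random simulation samples sequences and can simply miss the one that exposes a bug.

```python
from collections import deque

# Toy "design": a counter that is supposed to never reach 7.
# An off-by-one bug (state < 7 instead of state < 6) makes the
# illegal state reachable via a specific input sequence.
def step(state, inp):
    """Next-state function. inp is one of 'inc', 'dec', 'rst'."""
    if inp == 'rst':
        return 0
    if inp == 'inc':
        return state + 1 if state < 7 else state  # bug: should be state < 6
    return max(state - 1, 0)

def check_property(bad, inputs, init=0):
    """Exhaustive reachability (the essence of model checking):
    BFS over all states reachable from init under all input choices.
    Returns a shortest counterexample input trace, or None if the
    property 'never reach bad' holds for every input sequence."""
    seen = {init}
    queue = deque([(init, [])])
    while queue:
        state, trace = queue.popleft()
        for inp in inputs:
            nxt = step(state, inp)
            if nxt == bad:
                return trace + [inp]   # property violated: here is how
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [inp]))
    return None                        # property proven exhaustively

trace = check_property(bad=7, inputs=('inc', 'dec', 'rst'))
print(trace)  # a shortest input sequence reaching the illegal state
```

A random-stimulus loop over the same design would have to stumble onto seven consecutive increments to see the failure; the exhaustive check finds it, or proves its absence, by construction. That asymmetry is why formal and simulation answer different questions rather than competing for the same one.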
This matters because the cost of underusing formal is not just technical. It becomes organizational. When teams cannot establish enough confidence early, they compensate with more regressions, more debug, more schedule contingency, and more human coordination. The result is familiar: late surprises, defensive engineering, and expensive cycles spent trying to raise confidence after the design has solidified.
It is also worth being precise about what formal verification can and cannot do. In the property-checking sense, formal does not prove that nothing could ever be wrong. What it can do is prove that a design satisfies the properties you specify, under the assumptions and abstractions you define. That distinction matters. Property quality matters. Methodology matters. In some cases, specialized expertise still matters, although more packaged formal applications are making the technology easier to adopt.
In practice, the teams getting the most value from formal are applying it deliberately to the parts of the design where exhaustive analysis changes outcomes the most. DVCon sessions and industry papers reflect that reality. Formal is being used on NoCs, RISC-V, low-power verification, arithmetic blocks, security scenarios, ordering problems, and other focused challenges that matter in real commercial designs. The 2024 Wilson study shows how relevant that is becoming: 58% of projects now incorporate a RISC-V processor and 59% incorporate some type of AI accelerator, both doubling over the prior study.
That should be the real lesson for engineering leaders. The question is no longer whether formal methods are real. They are. The question is whether your verification strategy reflects the actual complexity and risk profile of the products you are building. If the answer is still mostly “more regressions,” then the burden of proof may be shifting in the wrong direction.
So why are we still finding so many bugs so late?
In many cases, it is because we are still investing most heavily in traditional methods that find problems only after enormous time, compute, and coordination have already been spent. As design complexity rises, formal needs to be treated as a core part of the verification toolkit.
The teams that make progress here will be the ones that adopt formal pragmatically, target the right problems, and build the internal confidence and methodology to use it where it changes outcomes. The challenge is not getting access to a formal engine. It is knowing how to apply formal well enough that it becomes part of an effective verification strategy.
Brandon Meredith is a Technical Solutions Consultant at AI TechSales Inc. with nearly three decades of experience in the semiconductor industry across engineering, infrastructure, methodologies, requirements, and operational transformation. He helps semiconductor organizations leverage powerful new AI-era solutions to solve critical engineering and operational challenges.