

A Case Study - A Post-Training Scheme for AI-Assisted Chip Verification: UART-to-AXI Block

  • Writer: Adi Fuchs
  • Mar 17
  • 2 min read
AI-Assisted UART-to-AXI Verification Environment: A Speedata Case Study

Verification remains the biggest bottleneck in chip development, often consuming 70% of the project cycle. While generative AI promises to accelerate this, the "garbage in, garbage out" rule still applies. To move past the hype, we present a controlled case study using a UART-to-AXI bridge to define exactly how AI agents should (and shouldn't) be embedded in a UVM workflow. We have also run similar explorations on more than a dozen complex proprietary design blocks.


To demonstrate how AI agents can be embedded in verification, we'll walk through a real case study.

We used a “UART to AXI” block as the first case study for bringing AI into VLSI development in a controlled, low-risk way. One of the first lessons was that “just point the AI at the repo” isn’t enough. UART and AXI are widely used, well-understood protocols, but the way a team structures environments, names components, runs regressions, and encodes assumptions is often internal and nuanced.


To bridge this gap, we developed a dedicated methodology guide. This Markdown-based "source of truth" captured our internal standards:


  • Preferred directory layout

  • Expected UVM component layering 

  • Strict definitions for scoreboard "completion"

  • How to handle synchronization/timeouts

  • Spec details worth documenting (e.g., the transition to 5-byte addressing and clarification of metadata-driven port selection)
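
To make this concrete, here is an invented skeleton of what such a Markdown "source of truth" might look like. The section names and details below are illustrative for this post, not a copy of our actual internal document:

```markdown
# Verification Methodology Guide (illustrative skeleton)

## Directory layout
- sim/    simulation harness and run scripts
- ve/     verification-environment package
- tests/  test package

## UVM component layering
- env -> agents (UART, AXI) -> sequencer / driver / monitor
- The scoreboard subscribes to monitors only; it never touches the DUT.

## Scoreboard "completion"
A transaction is complete only when its AXI activity has been observed
AND the matching UART response has been compared.

## Synchronization and timeouts
Every wait is bounded; an expired timeout is a test failure, never a warning.

## Spec notes
- Addressing is 5-byte.
- Port selection is metadata-driven.
```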


With that internal methodology written down in a single, reviewable place, the AI could reliably scaffold the UVM project (simulation harness, VE package, test package), propagate spec changes across all relevant files, and keep the documentation aligned, while engineers stayed in control of verification intent and sign-off criteria.

We then turned this environment into a "confidence story" via a purposeful test ladder. We focused on end-to-end correlation: ensuring each UART request maps exactly to the expected AXI activity, and that every UART response matches what the design actually did.


That guided both the scoreboard philosophy (treat unexpected bus activity as a real bug, verify writes with address+data, and always compare what comes back to what was asked) and the test strategy.
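
The correlation rules above can be sketched as a small Python model. This is an illustrative stand-in, not our actual scoreboard (the real environment is SystemVerilog/UVM, and the class and method names here are invented):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AxiWrite:
    """An AXI write transaction, identified by address AND data."""
    addr: int
    data: int


class CorrelationScoreboard:
    """Toy model of the correlation philosophy: unexpected bus activity
    is a real bug, writes are verified on address + data, and every
    predicted transaction must actually appear on the bus."""

    def __init__(self):
        self.expected = []  # AXI writes predicted from UART requests, in order
        self.errors = []

    def on_uart_request(self, addr, data):
        # Each UART command predicts exactly one AXI write.
        self.expected.append(AxiWrite(addr, data))

    def on_axi_write(self, addr, data):
        observed = AxiWrite(addr, data)
        if not self.expected:
            # Bus activity with no matching request is a bug, not noise.
            self.errors.append(f"unexpected AXI write: {observed}")
            return
        predicted = self.expected.pop(0)
        if predicted != observed:
            # Compare what came back against what was asked: address AND data.
            self.errors.append(f"expected {predicted}, observed {observed}")

    def passed(self):
        # A request that never reached the bus is also a failure.
        return not self.errors and not self.expected
```

The same structure extends naturally to reads and to comparing the UART response payload against the request that produced it.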


We built:

  • A sanity test to validate the basic flow

  • A back-to-back test to stress the “send the next command immediately when the previous response completes” behavior

  • A data-integrity test to prove the design's routing decisions (e.g., metadata-driven port selection) don't cross-contaminate state across ports
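
The idea behind the data-integrity rung can be sketched in Python against a hypothetical DUT model. Everything here (the `FakeDut` stand-in, port names, address and data ranges) is invented for illustration; the real checks run against the RTL in simulation:

```python
import random


class FakeDut:
    """Trivial stand-in for the real RTL that keeps independent per-port
    state. Purely illustrative; it exists so the check below can run."""

    def __init__(self):
        self.mem = {}

    def write(self, port, addr, data):
        self.mem[(port, addr)] = data

    def read(self, port, addr):
        return self.mem.get((port, addr))


def data_integrity_check(dut, ports, trials=100, seed=0):
    """Interleave writes whose metadata routes them to different ports,
    then verify each port holds only the data that was routed to it."""
    rng = random.Random(seed)
    mirror = {p: {} for p in ports}      # per-port software reference model
    for _ in range(trials):
        port = rng.choice(ports)
        addr = rng.randrange(16)
        data = rng.randrange(256)
        dut.write(port, addr, data)      # metadata selects the target port
        mirror[port][addr] = data
    # Cross-contamination shows up as one port reading back another's data.
    for port in ports:
        for addr, data in mirror[port].items():
            assert dut.read(port, addr) == data, (port, addr)
```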


The broader lesson from Speedata's case study is that AI fits best when it accelerates the process (scaffolding, consistency, documentation, and predictable test structure) while engineers retain ownership of what matters most: the verification intent, the pass/fail criteria, and the confidence story you can explain to anyone outside the team.


