
Automation has become the backbone of modern SystemVerilog/UVM verification environments. As designs scale from block-level modules to full system-on-chips (SoCs), engineers rely heavily on scripts to orchestrate compilation, simulation, and regression. The effectiveness of these automation flows directly impacts verification quality, turnaround time, and team productivity.
For many years, the Makefile has been the tool of choice for managing these tasks. With its rule-based structure and wide availability, Makefile offered a straightforward way to compile RTL, run simulations, and execute regressions. This approach served well when testbenches were relatively small and configurations were simple.
However, as verification complexity exploded, the limitations of Makefile have become increasingly apparent. Mixing execution rules with hardcoded test configurations leads to fragile scripts that are difficult to scale or reuse across projects. Debugging syntax-heavy Makefiles often takes more effort than writing new tests, diverting attention from coverage and functional goals.
These challenges point toward the need for a more modular and human-readable alternative. YAML, a structured configuration language, addresses many of these shortcomings when paired with Python for execution. Before diving into this solution, it’s important to first examine how today’s flows operate and where they struggle.
Current scenario and challenges
In most verification environments today, Makefile remains the default choice for controlling compilation, simulation, and regression. A single Makefile often governs the entire flow—compiling RTL and testbench sources, invoking the simulator with tool-specific options, and managing regressions across multiple testcases. While this approach has been serviceable for smaller projects, it shows clear limitations as complexity increases.
Below is an outline of key challenges.
- Configuration management: Test lists are commonly hardcoded in text or CSV files, with seeds, defines, and tool flags scattered across multiple scripts. Updating or reusing these settings across projects is cumbersome.
- Readability and debugging: Makefile syntax is compact but cryptic, which makes debugging errors non-trivial. Even small changes can cascade into build failures, demanding significant engineer time.
- Scalability: As testbenches grow, adding new testcases or regression suites quickly bloats the Makefile. Managing hundreds of tests or regression campaigns becomes unwieldy.
- Tool dependence: Each Makefile is typically tied to a specific simulator such as VCS, Questa, or Xcelium. Porting the flow to a different tool requires major rewrites.
- Limited reusability: Teams often reinvent similar flows for different projects, with little opportunity to share or reuse scripts.

These challenges shift the engineer’s focus away from verification quality and coverage goals toward the mechanics of scripting and tool debugging. Therefore, the industry needs a cleaner, modular, and more portable way to manage verification flows.
Makefile-based flow
A traditional Makefile-based verification flow centers around a single file containing multiple targets that handle compilation, simulation, and regression tasks. See the representative structure below.

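A minimal sketch of such a Makefile is shown here, assuming a VCS-based UVM testbench; the paths, flags, and test names are illustrative rather than taken from a real project.

```makefile
# Illustrative Makefile-based flow (sketch; paths, flags, and test
# names are assumptions for a VCS/UVM setup).
SIM     = vcs
TEST   ?= base_test
SEED   ?= 1

compile:
	$(SIM) -full64 -sverilog -ntb_opts uvm-1.2 \
	       +incdir+../tb ../rtl/design.sv ../tb/tb_top.sv \
	       -o simv -l compile.log

run: compile
	./simv +UVM_TESTNAME=$(TEST) +ntb_random_seed=$(SEED) -l $(TEST).log

regress:
	for t in base_test random_test stress_test; do \
	    $(MAKE) run TEST=$$t; \
	done

clean:
	rm -rf simv* csrc *.log
```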
This approach offers clear strengths: immediate familiarity to software engineers, no additional tool requirements, and straightforward dependency management. For small teams with stable tool chains, this simplicity remains compelling.
However, significant challenges emerge with scale. The terse syntax becomes problematic: escaped backslashes, shell expansions, and implicit dependencies make for arcane scripting rather than readable configuration. Debug cycles lengthen with cryptic error messages, and modifications require deep Make expertise.
Tool coupling is evident in the above structure—compilation flags, executable names, and runtime arguments are VCS-specific. Supporting Questa requires duplicating rules with different syntax, creating synchronization challenges.
As a result, maintenance overhead grows steeply. Adding tests requires multiple modifications, parameter changes demand careful shell escaping, and regression management quickly outgrows Make’s capabilities, forcing hybrid scripting solutions.
These drawbacks motivate the search for a more human-readable, reusable configuration approach, which is where YAML’s structured, declarative format offers compelling advantages for modern verification flows.
YAML-based flow
YAML (YAML Ain’t Markup Language) provides a human-readable data serialization format that transforms verification flow management through structured configuration files. Unlike Makefile’s imperative commands, YAML uses declarative key-value pairs with intuitive indentation-based hierarchy.
The YAML configuration structure below replaces the complex Makefile logic:
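The example is a sketch; the keys and values are assumptions chosen to mirror the flow described here, not a fixed schema.

```yaml
project:
  name: soc_verif
  version: "1.0"

tool: vcs                      # target simulator; could be questa or xcelium

compile:
  files:
    - ../rtl/design.sv
    - ../tb/tb_top.sv
  incdirs:
    - ../tb
  defines:
    - UVM_NO_DEPRECATED
  timescale: 1ns/1ps
  flags: "-full64 -sverilog"

simulate:
  flags: "+UVM_VERBOSITY=UVM_MEDIUM"
  logdir: logs

tests:
  - name: base_test
    seed: 1
  - name: random_test
    seed: random
    plusargs: "+burst_len=16"
```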


The modular structure becomes immediately apparent through organized directory hierarchies. As shown in Figure 1, a well-structured YAML-based verification environment separates configurations by function and scope, enabling different team members to modify their respective domains without conflicts.

Figure 1 The block diagram highlights the YAML-based verification directory structure. Source: ASICraft Technologies
Block-level engineers manage component-specific test configurations (IP1 and IP2), while integration teams focus on pipeline and regression management. Instead of monolithic Makefiles, teams can organize configurations across focused files: build.yml for compilation settings, sim.yml for simulation parameters, and various test-specific YAML files grouped by functionality.
Advanced YAML features like anchors and aliases eliminate configuration duplication, following the DRY (Don’t Repeat Yourself) principle, as the sketch below shows.
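This is a minimal sketch, assuming PyYAML-style merge keys (<<:); the test names and fields are illustrative.

```yaml
defaults: &sim_defaults        # anchor: define shared settings once
  verbosity: UVM_MEDIUM
  coverage: true
  timeout_ns: 100000

tests:
  - name: smoke_test
    <<: *sim_defaults          # alias/merge: inherit the shared settings
    seed: 1
  - name: stress_test
    <<: *sim_defaults
    seed: random
    timeout_ns: 500000         # override a single field
```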

Tool independence emerges naturally since YAML contains only configuration data, not tool-specific commands. The same YAML files can drive VCS, Questa, or XSIM simulations through appropriate Python parsing scripts, eliminating the need for multiple Makefiles per tool.
Of course, YAML alone doesn’t execute simulations; it needs a bridge to EDA tools. This is achieved by pairing YAML with lightweight Python scripts that parse configurations and generate appropriate tool commands.
Implementation of YAML-based flow
The transition from YAML configuration to actual EDA tool execution follows a systematic four-stage process, as illustrated in Figure 2. This implementation addresses the traditional verification challenge where engineers spend excessive time writing complex Makefiles and managing tool commands instead of focusing on verification quality.

Figure 2 The YAML-to-EDA tool bridge translates YAML configurations into simulator commands. Source: ASICraft Technologies
YAML files serve as comprehensive configuration containers supporting diverse verification needs.
- Project metadata: Project name, descriptions, and version control
- Tool configuration: EDA tool selection, licenses, and version specifications
- Compilation settings: Source files, include directories, definitions, timescale, and tool-specific flags
- Simulation parameters: Tool flags, snapshot paths, and log directory structures
- Test specifications: Test names, seeds, plusargs, and coverage options
- Regression management: Test lists, reporting formats, and parallel execution settings (see the sketch after this list)

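As one concrete illustration, a regression block in such a file might look like the following; the keys, report formats, and test names are assumptions rather than a fixed schema.

```yaml
regression:
  name: nightly
  parallel_jobs: 8             # maximum concurrent simulations
  report: html                 # assumed options: text, html, junit
  tests:
    - {name: smoke_test,  seed: 1,      iterations: 1}
    - {name: random_test, seed: random, iterations: 100}
    - {name: stress_test, seed: random, iterations: 20}
```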

Figure 3 The four phases of the Python YAML parsing workflow. Source: ASICraft Technologies
The Python implementation demonstrates the complete flow pipeline. Starting with a simple YAML configuration:
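The configuration below is a sketch; the file name sim_config.yml and its keys are assumptions.

```yaml
# sim_config.yml (hypothetical file name)
tool: vcs
files:
  - tb_top.sv
compile_flags: "-full64 -sverilog"
sim_flags: "+UVM_TESTNAME=base_test"
```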

A Python script then loads and processes this configuration:
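Below is a minimal sketch of that script, assuming PyYAML is installed and the configuration above is saved as sim_config.yml; the command construction is illustrative, not a complete option set for either tool.

```python
import subprocess
import yaml  # PyYAML: pip install pyyaml

# Load/parse: convert the YAML file into native Python dicts and lists.
with open("sim_config.yml") as f:
    cfg = yaml.safe_load(f)

# Extract: read configuration values via dictionary keys.
tool = cfg["tool"]
files = " ".join(cfg["files"])
compile_flags = cfg.get("compile_flags", "")
sim_flags = cfg.get("sim_flags", "")

# Build commands: assemble tool-specific shell commands.
if tool == "vcs":
    compile_cmd = f"vcs {compile_flags} {files} -o simv"
    run_cmd = f"./simv {sim_flags}"
elif tool == "xcelium":
    compile_cmd = f"xrun -elaborate {compile_flags} {files}"
    run_cmd = f"xrun -R {sim_flags}"
else:
    raise ValueError(f"Unsupported tool: {tool}")

# Display/execute: show the commands; uncomment to launch the tool.
print("Compile:", compile_cmd)
print("Run:    ", run_cmd)
# subprocess.run(compile_cmd, shell=True, check=True)
# subprocess.run(run_cmd, shell=True, check=True)
```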

When executed, the Python script produces clear output, showing the command translation, as illustrated below:
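Given the sketch above and the sample configuration, the output would look roughly like this:

```text
Compile: vcs -full64 -sverilog tb_top.sv -o simv
Run:     ./simv +UVM_TESTNAME=base_test
```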

The complete processing workflow operates in four systematic phases, as detailed in Figure 3.
- Load/parse: The PyYAML library converts YAML file content into native Python dictionaries and lists, making configuration data accessible through standard Python operations.
- Extract: The script accesses configuration values using dictionary keys, retrieving tool names, file lists, compilation flags, and simulation parameters from the structured data.
- Build commands: The parser constructs tool-specific shell commands by combining extracted values with appropriate syntax for the target simulator (VCS or Xcelium).
- Display/execute: Generated commands are shown for verification or directly executed through subprocess calls, launching the actual EDA tool operations.

This implementation creates true tool-agnostic operation. The same YAML configuration generates VCS, Questa, or XSIM commands by simply updating the tool specification. The Python translation layer handles all syntax differences, making flows portable across EDA environments without configuration changes.
The complete pipeline—from human-readable YAML to executable simulation commands—demonstrates how modern verification flows can prioritize engineering productivity over infrastructure complexity, enabling teams to focus on test quality rather than tool mechanics.
Comparison: Makefile vs. YAML
Both approaches have clear strengths and weaknesses that teams should evaluate based on their specific needs and constraints. Table 1 provides a systematic comparison across key evaluation criteria.

Table 1 A flow comparison between Makefile- and YAML-based approaches. Source: ASICraft Technologies
Where Makefiles work better
- Simple projects with stable, unchanging requirements
- Small teams already familiar with Make syntax
- Legacy environments where changing infrastructure is risky
- Direct execution needs, where quick debugging without intermediate layers is required
- Incremental builds where dependency tracking is crucial

Where YAML excels
- Growing complexity with multiple test configurations
- Multi-tool environments supporting different simulators
- Team collaboration where readability matters
- Frequent modifications to test parameters and configurations
- Long-term maintenance across multiple projects

The reality is that most teams start with Makefiles for simplicity but eventually hit scalability walls. YAML approaches require more extensive initial setup but pay dividends as projects grow. The decision often comes down to whether you’re optimizing for immediate simplicity or long-term scalability.
For established teams managing complex verification environments, YAML-based flows typically provide better return on investment (ROI). However, teams should consider practical factors like migration effort and existing tool integration before making the transition.
Choosing between Makefile and YAML
The challenges with traditional Makefile flows are clear: cryptic syntax that’s hard to read and modify, tool-specific configurations that don’t port between projects, and maintenance overhead that grows with complexity. As verification environments become more sophisticated, these limitations consume valuable engineering time that should focus on actual test development and coverage goals.
YAML-based flows address these fundamental issues through human-readable configurations, tool-independent designs, and modular structures that scale naturally. Teams can simply describe verification intent, such as running 100 iterations with coverage, while the flow engine handles all tool complexity automatically. The same approach works from block-level testing to full-chip regression suites.
Key benefits realized with YAML
- Faster onboarding: New team members understand YAML configurations immediately.
- Reduced maintenance: Configuration changes require simple text edits, not scripting.
- Better collaboration: Clear syntax eliminates the “Makefile expert” bottleneck.
- Tool flexibility: Switch between VCS, Questa, or XSIM without rewriting flows.
- Project portability: YAML configurations move cleanly between different projects.

The choice between Makefile and YAML approaches ultimately depends on project complexity and team goals. Simple, stable projects may continue benefiting from Makefile simplicity. However, teams managing growing test suites, multiple tools, or frequent configuration changes will find that YAML-based flows provide better long-term returns on their infrastructure investment.
Meet Sangani is an ASIC verification engineer at ASICraft Technologies.
Hitesh Manani is a senior ASIC verification engineer at ASICraft Technologies.
Shailesh Kavar is an ASIC verification technical manager at ASICraft Technologies.
Related Content
- Addressing the Verification Bottleneck
- Making Verification Methodology and Tool Decisions
- Gate level simulations: verification flow and challenges
- Specifications: The hidden bargain for formal verification
- Shift-Left Verification: Why Early Reliability Checks Matter
