Design Flow

Delivering IP: A Three-Pronged Attack

Beyond good design and architecture, strong procedures for revision control and release management and effective techniques for reproducing customers' problems are needed to overcome a multiplicity of challenges.

by R. Mark Gogolewski


Denali Software provides memory simulation models and tools. Our product, Memorymodeler, covers both discrete and embedded memory components architected as class-based, parameterized C simulation cores. These cores model DRAM, EDO (extended-data-out) DRAM, synchronous DRAM, synchronous graphics RAM, DDR (double-data-rate) SDRAM, synchronous SRAM, flash, synchronous EPROM, FIFO, PROM, and Rambus DRAM for all commercial Verilog and VHDL simulators on all major platforms.

We have the usual challenges of any engineering design team working under strong time-to-market pressures. As an IP provider, we also face the challenge of supporting a wide array of models in an aggressive market, compounded by the need to support multiple simulation environments on multiple platforms.

We tackle these challenges on three fronts: the design and architecture of our models and system, the procedures for revision control and release management, and the techniques for reproducing the customers' problems. The methodology we developed has worked remarkably well, and a key part of its success has been the SOS revision control tool. As a result, we're able to create very high quality releases quickly and with confidence.

The lessons we have learned and the techniques we employ should work well not only for IP developers, who will encounter the same challenges, but for designers in general as well.

The challenges
If one word can serve as a summary of most of our challenges, it is "multiple." First, of course, we support several core classes of memories, each of which must cover the many variations in vendors' features, functionality, timing, and naming conventions. Then, too, we must support all the major VHDL and Verilog simulators, plus cycle-accurate and hardware/software codesign environments: Fusion, Leapfrog, Modelsim, NC, Optium, Polaris, QuickVHDL, Seamless, Speedsim, VCPU, VCS, Verilog-XL, VSS, and others. Furthermore, we support the simulators on all the major Unix platforms and on Windows NT and Windows 95.

In addition, as with most design teams, multiple engineers are often working together on the same technology. Moreover, we often have several major feature enhancements in development at the same time, and frequently parallel projects need to modify some of the same source code. So projects often overlap the source, and engineers often overlap projects.

Beyond the "multiple" issues, we face another set of challenges. We have to get new features and models to market quickly in response to new memory products. Thus we need to produce frequent releases to keep pace with new memory technology.

We also have to maintain quality. High quality not only satisfies customers, it's smart for business. High-quality software reduces support costs and keeps developers focused on development.

Finally, we must provide fast, quality support. We have to be able to diagnose a customer's problem and respond as soon as possible. The greatest challenge in doing that is reproducing the customer's environment.

Architecture
Writing a very specific model for a specific device in a single language is the most expedient and most efficient way to produce a single model. However, if you want to support multiple sets of functionally similar parts, that approach will result in replication of code and increased support and maintenance costs. On the other hand, a completely generalized model to cover all cases would be unrealistic to write and is likely to be more inefficient and complex. In between those two extremes lies the proper balance of specificity and generalization.

Upon investigation, we realized that for each class of memory the members share similar functionality, and we chose to develop one model for each set of functionally similar memories (SRAM, for example). Within each class model, we parameterized the differences (bus sizes, timing, pin names, command coverage) such that it covers all available parts.
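
As a rough illustration of this approach, the sketch below shows how one parameterized class model might be organized in C. All names and fields are hypothetical, not Denali's actual API; the point is that one core model covers a whole family of parts, with the per-vendor differences confined to a parameter block filled in from the specification at simulation start.

    #include <stddef.h>

    typedef struct {
        unsigned addr_width;      /* address bus width in bits          */
        unsigned data_width;      /* data bus width in bits             */
        double   t_access_ns;     /* read access time for timing checks */
        const char *pin_prefix;   /* vendor-specific pin naming         */
        int      has_burst_mode;  /* optional command coverage          */
    } sram_params_t;

    typedef struct {
        sram_params_t params;     /* the parameterized differences      */
        unsigned char *storage;   /* backing store for memory contents  */
    } sram_model_t;

    /* One core model handles every SRAM variant; its behavior is steered
       entirely by the parameter block, never by per-part code. */
    int sram_read(const sram_model_t *m, unsigned long addr, unsigned char *out)
    {
        if (addr >= (1UL << m->params.addr_width))
            return -1;                            /* out-of-range access */
        size_t bytes = m->params.data_width / 8;
        for (size_t i = 0; i < bytes; i++)
            out[i] = m->storage[addr * bytes + i];
        return 0;
    }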

Additionally, all our memory models share some of the same basic types of operations, such as loading and saving data from files, licensing, logging statistical information, and communicating with analysis tools. Therefore, we created a set of low-level utility packages, shared by all the models, to perform these operations.
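
A sketch of what such a shared utility layer might expose is shown below. The function names are illustrative only, but because every core model links against the same implementations, file I/O, licensing, and statistics logging behave identically across model classes.

    /* Hypothetical header for the shared low-level utilities; names are
       illustrative, not an actual Denali interface. */
    #ifndef MEM_UTIL_H
    #define MEM_UTIL_H

    #include <stddef.h>

    /* Load and save memory contents from/to a data file. */
    int  mem_image_load(void *storage, size_t bytes, const char *path);
    int  mem_image_save(const void *storage, size_t bytes, const char *path);

    /* Check out a license when a model instance is created. */
    int  license_checkout(const char *feature);

    /* Record statistical information (access counts, bank usage) and pass
       events along to analysis tools. */
    void stats_log(const char *model_instance, const char *event);

    #endif /* MEM_UTIL_H */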

Since our models are written in C, we communicate with each simulator through its foreign language interface. However, although PLI is a standard for Verilog simulators, there are often minor differences in interpretation and implementation, and for VHDL no such standard currently exists, so each simulator has its own distinct interface. We insulated our models from all these variations by creating a simulation interface layer.
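
The sketch below suggests what such an interface layer might look like, assuming hypothetical function names. The core models call only these generic entry points; a small per-simulator adapter implements them on top of that simulator's PLI, FLI, or proprietary C interface.

    typedef void (*sim_callback_t)(void *user_data);

    typedef struct {
        /* read or drive a signal on the model boundary */
        int  (*get_signal)(const char *name, unsigned long *value);
        int  (*put_signal)(const char *name, unsigned long value);
        /* schedule a callback after a delay in simulation time units */
        int  (*schedule)(double delay, sim_callback_t cb, void *user_data);
        /* report messages through the simulator's own logging channel */
        void (*message)(const char *text);
    } sim_if_t;

    /* Each adapter (Verilog-XL, VCS, Leapfrog, ...) registers its table once
       at elaboration; the core models never see the difference. */
    void sim_if_register(const sim_if_t *vtable);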

For each class of memory, there are variations in functionality, timing, naming, and so on, depending on the manufacturer. A customer may be using memory from one manufacturer now, yet may switch to a different manufacturer later.

Accommodating variations
The core model for each class accommodates those variations by reading a specification file, called a SOMA (Specification of Memory Architecture) file, generated by our interactive tool, Memmaker (see Figure 1). Using Memmaker, customers can select the relevant class of memory and quickly customize the parameters of the model. Memmaker then generates the specification file used by the core model and generates the Verilog or VHDL shell that's added to the design. With this configuration, we're able to support many different variations and give customers the flexibility to generate their own variations.

Figure 1

Specification-based modeling environment


A specification-based system supports discrete and embedded memory models in all VHDL and Verilog environments. Designers use Memmaker, a graphical tool, to create Specification of Memory Architecture (SOMA) files, which are used at simulation initialization to define the memory simulation behavior.

With this approach, a single SOMA file represents the same discrete part or a hard core in all simulation environments. We've also achieved additional reuse of our code by having all core models built with the same underlying set of utilities. This reuse of code naturally leads to greater quality and consistency.
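
The actual SOMA format is not described in this article, so the following sketch assumes a simple key/value text layout purely for illustration (it also reuses the hypothetical sram_params_t type from the earlier sketch). The essential point is that the same specification file configures the core model identically in every simulation environment.

    #include <stdio.h>
    #include <string.h>

    /* Assumes the sram_params_t type from the earlier sketch. */
    int soma_load(const char *path, sram_params_t *p)
    {
        FILE *f = fopen(path, "r");
        if (!f)
            return -1;

        char key[64];
        double value;
        while (fscanf(f, "%63s %lf", key, &value) == 2) {
            if      (strcmp(key, "addr_width") == 0) p->addr_width  = (unsigned)value;
            else if (strcmp(key, "data_width") == 0) p->data_width  = (unsigned)value;
            else if (strcmp(key, "t_access")   == 0) p->t_access_ns = value;
            /* ... remaining parameters elided ... */
        }
        fclose(f);
        return 0;
    }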

As one might imagine, we have a concurrent development schedule, and we must track numerous platforms and releases. To ensure that we could quickly release models and maintain the highest quality, we adopted a robust revision control and release management strategy.

Selecting a management tool
In selecting a tool we had a few basic requirements. One was ease of administration and use. We wanted to focus our engineering resources on designing our models and software--not on writing scripts, tinkering with tools, or managing repositories.

Another was revision control of both files and directories. We knew that we'd need to change file names and directory structures. We had to ensure that we could always re-create any release with the original hierarchy. Without this ability, previous makefiles wouldn't work, and it would be impossible to re-create a release quickly and with confidence.

Concurrent development support also was important. With multiple features under development at the same time, we had to be able to ship product with a subset of projects completed.

We quickly rejected public-domain revision control tools like RCS and SCCS. These tools provide revision control for files but do little to help you manage a whole project with a dynamic directory hierarchy. We would have been faced with developing numerous scripts to achieve project control and would still have fallen short of our requirement for revision control of directories. CVS, another freely available tool, provided features for project control but was designed specifically for concurrent development in which releases are made after all enhancements in progress are completed. We finally settled on using SOS from ClioSoft. SOS not only met all our basic requirements, it also has some unique features that are very useful.

First, we made sure that every source file and directory was placed under SOS control. Source files aren't just the core model files, but also all the related files, like test benches, stimulus files, makefiles, and scripts. To make a patch to a prior release, we must be able to re-create it exactly, rebuild it with the fix, and rerun all tests before shipping the product.

We decided that any major new feature development would be done on its own branch. For example, assume that we're adding a history logging feature at the same time as a debugging feature to peek and poke into memory locations. Different engineers may be working on the two features, and one engineer may be working on both features. One feature may be completed first, and we may need to release the new functionality before completing the other. Creating a separate branch of development lets us cope with all those issues easily. We define branches called history_log and peek_and_poke and use them independently for code development.

All releases are made using revisions from the main line of development only. If one feature is completed and tested, all the changes on the corresponding branch are merged back onto the main line of development (see Figure 2). All the tests are run again using the merged revisions on the main line of development, and we are ready to create a release even when other development is in progress.

Figure 2

Branching and merging


SOS from ClioSoft is used to easily branch revisions from the main line of development. It is also used to merge these independent branches back into the main line of development.

When a new release is made, we take a "snapshot" of the project tree with SOS, which records the revisions of all the files and directories relevant for the release. Later, if changes need to be made to this release, we use SOS to re-create the release tree using the snapshot. Again, we use a branch from versions used in the release to isolate those changes from ongoing development. After the change is tested, the branch is merged into the main line of development.

SOS has a file browser-like display to show the project tree. A file can be further expanded to expose all the current revisions. This capability makes it very easy to see the revision history of the files. The graphical user interface is intuitive and requires little training.

Each engineer works in a unique "work area," or "sandbox." SOS creates the project hierarchy in the work area and populates the directories with the specified revision of each file. This setup effectively isolates each engineer from other team members who may be making changes. The engineer can then control when and how the work area is updated.

To create a work area, you specify a rule that selects the revision of each file. For example, if you're working with the peek-and-poke debugging feature, then your revision search rule could be peek_and_poke, main. If a file has a peek_and_poke branch, then SOS will pick the latest revision on that branch for your work area. If it doesn't have that branch, it will pick the latest revision on the main line of development. If you were also working on the history-logging project, then you would create a separate work area with a revision search rule of history_log, main.
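
Conceptually, populating a work area behaves like the small routine sketched below: walk the search rule left to right and take the latest revision on the first branch that exists for the file. This is only an illustration of the user-visible behavior, not ClioSoft's implementation, and all names are hypothetical.

    #include <stddef.h>
    #include <string.h>

    typedef struct {
        const char *branch;   /* branch this revision lives on  */
        int         number;   /* revision number on that branch */
    } revision_t;

    /* Walk the rule (e.g. {"peek_and_poke", "main"}) left to right and return
       the latest revision on the first branch that has any revisions. */
    const revision_t *select_revision(const revision_t *revs, size_t nrevs,
                                      const char *const *rule, size_t nrule)
    {
        for (size_t r = 0; r < nrule; r++) {
            const revision_t *best = NULL;
            for (size_t i = 0; i < nrevs; i++) {
                if (strcmp(revs[i].branch, rule[r]) == 0 &&
                    (best == NULL || revs[i].number > best->number))
                    best = &revs[i];
            }
            if (best)
                return best;
        }
        return NULL;
    }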

SOS uses five alert icons, which are displayed adjacent to the file in question and are propagated up the directory hierarchy. They alert you to situations like a file being checked out or an unmerged branch on a file (see Figure 3). The icons looked interesting when we first started using SOS, but they've proven invaluable during the course of development. They've helped us avoid errors like forgetting to check in all the files, which is unbelievably disruptive to the team. In particular, the unmerged branch icon has been a huge time saver, because it makes sure that all the files that were branched to implement a feature actually are merged back into the main line.

Figure 3

Alert icons


SOS gives graphical indications of the state of version control. Here the "checked out" icon (resembling a leaf) reminds the developer that the file flash.c is currently checked out for editing.

Tracing and tests
We interviewed many designers and asked them to pinpoint the biggest obstacle for success with commercial models. Unacceptable turnaround time on support was one of the issues that came up very frequently. Although turnaround time can reflect the vendor's commitment to providing high-quality support, we discovered a fundamental structural problem with the mechanism for providing support for the software. We found that on discovering a problem, the designer typically had no means of precisely communicating the problem to the customer support staff. The designer would typically waste days isolating a problem or going through a lot of effort transferring the design over to the vendor. In these days of highly compressed time-to-market requirements, all that wasted time is unacceptable.

We've implemented a "tracing" technology whereby we can capture all the events on the boundary of the model. When a designer encounters a problem with the model, he or she reruns the simulation with tracing enabled. This procedure generates a file that captures not only all the events on the boundary but also all the events scheduled by the model. The customer then e-mails the trace file and the specification file for the model to us. We have tools that convert the files into a Verilog or VHDL test bench. Using the test bench, we can completely reproduce the customer's problem within half an hour of the customer's discovering it, regardless of where the customer is located.
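
The actual trace file format isn't documented here, but the sketch below suggests the kind of timestamped record such a mechanism might write for every boundary event and every event the model schedules, which is what allows a test bench to replay the exact stimulus later. All names are hypothetical.

    #include <stdio.h>

    typedef enum { EV_PIN_CHANGE, EV_MODEL_SCHEDULED } trace_kind_t;

    typedef struct {
        double        time_ns;   /* simulation time of the event      */
        trace_kind_t  kind;      /* boundary event or scheduled event */
        const char   *signal;    /* pin or internal event name        */
        unsigned long value;     /* sampled or driven value           */
    } trace_event_t;

    /* Append one timestamped record to the trace file. */
    void trace_emit(FILE *trace, const trace_event_t *ev)
    {
        fprintf(trace, "%.3f %d %s %lx\n",
                ev->time_ns, (int)ev->kind, ev->signal, ev->value);
    }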

Verification
In a striking parallel between software and hardware design, our model class development process requires a rigorous verification process. We quickly realized that developing or obtaining large sets of internal test suites to cover specific parts and functionality solved only a subset of the verification issues. Furthermore, the long-term goal of covering a broad spectrum of behavior makes individual tests, though efficient to create, impractical.

As with our core models, we chose a class-based approach and created Autotest, a set of class-based, parameterized tests. In addition to generating the SOMA parameterization file and the Verilog or VHDL shell file (or both) for the memory model, Memmaker can optionally generate a test model instantiation and a top-level component that wires the model and the test components together. Both the Autotest component and the memory component are then parameterized by the same SOMA specification.

When simulated, the test component will drive the memory model through a sequence of tests parameterized for function and timing. The Autotest class-based models track the memory model's output and input pins for proper feedback, verify that all timing checks are correctly caught, and flag any unexpected behavior. At the end of simulation, Autotest generates a summary of the passed and failed tests.
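
The generated Autotest components are Verilog or VHDL, but the self-checking idea can be sketched in C with hypothetical names: each test step's stimulus and expected timing are derived from the same SOMA parameters that configure the memory model, and a running pass/fail count feeds the end-of-simulation summary.

    #include <stdio.h>

    typedef struct {
        const char   *name;          /* e.g. "write/readback"               */
        unsigned long addr;
        unsigned long wdata;
        double        expect_delay;  /* derived from the SOMA timing values */
    } test_step_t;

    /* Compare observed data and delay against expectations and keep a
       running pass/fail count for the end-of-simulation summary. */
    int check_step(const test_step_t *s, unsigned long rdata, double delay,
                   int *passed, int *failed)
    {
        int ok = (rdata == s->wdata) && (delay <= s->expect_delay);
        if (ok) {
            (*passed)++;
        } else {
            (*failed)++;
            fprintf(stderr, "FAIL %s: data %lx delay %.2f\n",
                    s->name, rdata, delay);
        }
        return ok;
    }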

R. Mark Gogolewski is the vice president of engineering and co-founder of Denali Software, Inc. in Palo Alto, Calif. In addition to discrete memory coverage, his current responsibilities include retargeting the company's class-based C modeling technology for embedded component simulation. Previously, he worked at Vantage Analysis Systems, where he was responsible for graphical tools development.


