System emulation technology enables users to rapidly prototype HDL-based designs on programmable hardware before committing their designs to production. In this paper, we show that this methodology is particularly viable for at-speed verification of communications designs. We show how complementary tools can be used to generate HDL code for a digital wireless communications system. We then consider how this design can be implemented and evaluated using a rapid prototyping methodology. Synthesis produces a gate-level specification for implementation in FPGAs, which are combined with other components in the Aptix rapid prototyping environment. Issues discussed include software tool flows, prototype hardware assembly, system performance, and debugging.

Rapid Prototyping

Prototyping has long been used as a method to validate designs before committing them to production. As designs grew larger and more complex, the time to build and debug the prototype increased to impractical levels. As a result, the emphasis shifted to validating the design with simulation. However, system-level simulation is limited by the availability of software models and the difficulty of modeling real-world stimuli. There is a compelling need for a verification strategy that combines the ease of use of simulation with the realism of a prototype.

With the advent of synthesis, it is now feasible to create technology-independent descriptions of a system-level design, emulate the design in FPGAs, then re-target the design to the selected technology once the design is stable. When coupled with Aptix's proprietary FPIC™ and FPCB™ technologies and development software, it is now possible to build a flexible, easy-to-change prototype in a matter of hours. This enables a rapid prototyping verification methodology that combines the advantages of prototyping with the flexibility of simulation.

In simulation, the network is verified using software models and a software algorithm.
Execution is done on workstation CPUs or using acceleration methods. Simulation can be performed at both the ASIC and system netlist level. Execution speed depends on the level of abstraction of the simulation models. Behavioral models allow for higher-level (system) simulation, but they are often difficult to create for complex ASICs. Generally, simulation allows observation of only a small fraction of real-world interactions, owing to the vast amounts of data and run-time overhead involved.

An alternative to simulation is emulation. In emulation, the network is verified by re-targeting all or parts of it to a different technology, such as FPGAs. Verification takes place in a hardware environment rather than in the virtual environment of a simulator. In ASIC emulation, the ASIC netlist is re-targeted from the ASIC vendor's library to a different technology, typically an array of FPGAs. Software tools and dedicated hardware have been developed with the aim of automating this re-targeting process. Once the design has been re-targeted, the ASIC emulation hardware can be exercised by the designer's si...
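The abstraction trade-off described above can be made concrete with a toy example. The following sketch (in Python rather than an HDL, purely for illustration) models the same 4-bit adder at two levels: a behavioral model that computes the result in one operation, and a structural model that evaluates individual gates, as a netlist-level simulation would.

```python
# Illustrative only: one function modeled at two abstraction levels.
# The behavioral model is fast; the gate-level model is slower but
# mirrors the structure of the synthesized netlist.

def adder_behavioral(a: int, b: int) -> int:
    """Behavioral model: a single arithmetic operation."""
    return (a + b) & 0xF  # 4-bit wrap-around

def full_adder(a: int, b: int, cin: int):
    """Gate-level model of one full-adder cell (XOR/AND/OR gates)."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def adder_gate_level(a: int, b: int) -> int:
    """Structural model: a ripple-carry chain of four full adders."""
    carry, result = 0, 0
    for bit in range(4):
        s, carry = full_adder((a >> bit) & 1, (b >> bit) & 1, carry)
        result |= s << bit
    return result

# Both models agree on every input pair, but each gate-level addition
# evaluates ~20 gates -- the run-time cost of netlist-level detail.
assert all(adder_behavioral(a, b) == adder_gate_level(a, b)
           for a in range(16) for b in range(16))
```

Scaled up to a complete ASIC, this per-operation overhead is what limits how much real-world interaction a netlist-level simulation can observe.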
Over the recent past, the focus of EDA tools has been on tackling chip-level issues. Today, system design issues are emerging as the top concerns of design teams. These issues include the inability to simulate at the system level, incomplete and inaccurate system specifications, and immature hardware/software co-design. Co-design is an essential element of new design methodologies for system development:

• ratios of seven software engineers per hardware engineer are now common in the development of complex embedded systems;
• errors cost 10 times more to correct at the integration stage than during design;
• system specifications change, impacting both hardware and software development;
• interfaces between hardware and software are non-trivial for system design.

A comprehensive hardware/software co-design methodology needs to deliver a unified implementation environment addressing the issues outlined above. What kind of tools can meet this challenge? Are EDA vendors close to providing the tools that designers need?

Brian Bailey, Mentor Graphics, Wilsonville, OR

Over the past few years a number of commercial products have been introduced that address various parts of the system verification problem. The recent introductions have focused on the hardware/software boundary. These tools are a start, but the field is very much in its infancy. Today's tools are limited to a single abstraction of software and, in most cases, a single hardware simulation session. This makes them useful as an implementation verification solution and enables the notion of a virtual hardware prototype upon which software verification can be performed. It does not make them useful for higher-level design or concept validation; that requires tools that are much more flexible and can cross many abstraction domains. At the same time, these hardware/software simulators must join with other parts of the system verification problem to create a complete verification solution.
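The "virtual hardware prototype" idea amounts to running embedded software against a simulated hardware model in lockstep. The following minimal sketch illustrates the pattern; the names (Timer, read_reg, write_reg) are hypothetical and do not belong to any real co-simulation tool's API.

```python
# Hypothetical sketch of lockstep hardware/software co-simulation.
# The "hardware" is a cycle-based model of a memory-mapped countdown
# timer; the "software" is a driver routine that programs it and polls
# a status register, advancing the hardware one cycle per iteration.

class Timer:
    """Cycle-based model of a memory-mapped countdown timer."""
    def __init__(self):
        self.count = 0
        self.done = 0

    def write_reg(self, addr, value):
        if addr == 0x0:              # LOAD register starts the countdown
            self.count, self.done = value, 0

    def read_reg(self, addr):
        return self.done if addr == 0x4 else self.count  # STATUS at 0x4

    def step(self):
        """Advance the hardware model by one clock cycle."""
        if self.count > 0:
            self.count -= 1
            if self.count == 0:
                self.done = 1

def software_driver(hw, cycle_budget=100):
    """'Embedded software' exercised against the hardware model."""
    hw.write_reg(0x0, 10)            # program a 10-cycle countdown
    for cycle in range(cycle_budget):
        hw.step()                    # hardware and software stay in lockstep
        if hw.read_reg(0x4):         # poll STATUS until the timer expires
            return cycle + 1         # cycles elapsed when 'done' is seen
    return None                      # timer never fired within the budget

print(software_driver(Timer()))      # -> 10
```

Real tools differ in how accurately the processor and its bus traffic are modeled, which is exactly the single-abstraction limitation noted above: this sketch fixes one abstraction for the software and one for the hardware.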
As with all simulation solutions, models are a key issue, both in terms of availability and accuracy. It appears that many core providers are stepping up to the challenge of providing simulation models for their processors, or are willing to assist in the validation of models. This will be a key step in the widespread acceptance of these tools.

Kurt Keutzer, Synopsys, Mountain View, CA

There is no single hardware/software co-simulation problem, and as a result there is no single tool or technology that can successfully solve all the problems associated with co-verification of hardware and software. Each developer must trade off the desired performance against the level of accuracy. Accuracy itself has two aspects: the first is timing accuracy; the second might be called veracity, or the extent to which the model accurately represents the design under verification. Once these parameters are fixed, the remaining challenge is to determine which solution comes closest to meeting the requirements at a cost, in time and money, that one is willing to pay. For hardware designers the trade-off...