Optimizing the Embedded Software Design Process
Many tools and techniques are available to help reduce time and cost during the product development cycle and can be applied regardless of the software methodology used.
For the purposes of this article, we'll define time to market as the time it takes to get from product specification to the product shipping in full volume production. Volume production for device manufacturers is kind of like Black Friday for the retail industry. It's where we finally turn the corner from spending (and losing) money on a product to making money on it. By using this definition, we can put a stop to all those cheaters who ship buggy products just to make a release date. This clearly ends up causing a lot more harm than good. It delays or saps profitability from the project and, even worse, it can destroy your company's reputation. We don't want to sacrifice quality in our search for speed.
Ok, now that we've established the ground rules, let's talk about the major factors that delay embedded software projects: requirements issues, incomplete testing, and software defects.
While issues related to requirements should not be overlooked, addressing these issues will undoubtedly lead into arguments about methodologies, and that is outside the scope of what will be covered here. The only reason that incomplete testing makes the list is because it allows bugs to slip through the process undetected. That leaves us with software defects and the act of removing those defects from the system: debugging. A survey of San Jose ESC attendees in 2006 concluded that debugging is the most time-consuming and costly phase of software development, "with 63 percent of respondents citing debugging as the most significant problem they encounter, almost double any other single task."
Minimizing Software Defects
In order to streamline the embedded development process, we must tackle the biggest problem facing software developers: minimizing software defects. At every step in the development process we must look for opportunities to minimize the introduction of software defects and optimize the rate at which defects can be removed. Therefore we'll look at things that can be done in the design phase, opportunities to automate in various phases, and advanced tools for debugging. When all of these techniques are used together, significant gains can be achieved in reducing software defects.
Design
KISS, KISS, KISS. Keep it simple, stupid! Minimizing complexity is perhaps the most important aspect of designing efficiently. And it's easier said than done. Most of the time when engineers sit down to design and put pencil to paper, the first thing that comes out is complex, convoluted, and confusing. Achieving simple and elegant designs typically takes many iterations and a concerted effort to eliminate anything that is not needed. Spending the extra cycles at this stage to analyze the design and look for ways to simplify will pay huge dividends in the long term. Minimal and elegant design leads to much greater maintainability and fewer software defects in the long run.
Realizing good componentization is another important aspect of this phase. When the design of a subsystem gets too complex, break the subsystem into easy-to-understand components. A good rule of thumb is that one engineer should completely understand every line of source for a single component and be responsible for it. Oftentimes the underlying operating system can lend a hand in managing componentization and enforcing rules about how various components interact with one another. A microkernel OS with memory protection and good separation capabilities makes this kind of isolation practical to enforce.
When a system is properly componentized, software defects that pop up later are easy to isolate and contain. Fixes for those defects come faster and don't negatively affect other components in the system. Furthermore, with a well-defined, message-based API for the component, it is easy to develop complete test cases that are re-playable for regression testing.
Automate
While minimizing complexity provides benefits for long-term maintainability and overall efficiency of the development process, there are many tasks in the everyday workflow that can be automated to realize immediate gains in productivity and efficiency.
Debug like a Pro
There is no single debugging tool or technique that is a silver bullet for finding and eliminating software defects. While finding bugs automatically with the tools mentioned in the previous section is by far the most efficient method, automatic bug detection cannot identify all software defects. Finding the rest of the bugs efficiently is primarily about system visibility: if you can't see it, you can't find it. A complete set of debugging tools gives you that visibility at every level of the system.
It is also imperative that the developers using these tools are experts at using them. It does no good to give someone a circular saw if they are going to try to use it to bang a nail into a piece of wood. Not only do you have to have the right tools at your disposal, but you have to know which one to use, and how to use it for the given situation.
Summary
Looking back over the recommendations I've provided, I don't think there is anything earth-shattering here, but I'm continually amazed by the number of software organizations I've worked with that don't do some of the simple things outlined above. For example, I've lost track of the number of Linux developers who have told me the only debug tool they use is printf. These are people who work at world-class companies developing applications that are millions of lines of code. And they are absolutely handicapped by their toolsets, like a carpenter without a hammer! So, while most of what I've put in here seems like common sense, actually walking the walk is another matter. How do you and your organization stack up?
Joe Fabbre is technical solutions manager for Green Hills Software, www.ghs.com, 800-765-4733.
Managing Time to Market with Early Software Design Verification
Ken Karnofsky, senior strategist for signal processing applications at The MathWorks
In today's complex, algorithm-intensive wireless communications systems, verification is a major contributor to project delays and engineering costs. The current algorithm verification process is inefficient and creates opportunities to introduce errors. In a typical flow, designs start with algorithm developers, who pass the design to software and hardware teams using specification documents. Each team typically crafts its own manual test procedures to determine that the implementation is functionally correct according to their interpretation of the specifications.
Compounding this inefficiency is the use of separate tools and workflows for software, digital, and RF/analog hardware components, which inhibits cross-domain verification. And engineers often discover late in the development process that algorithms don't work as intended in the target environment.
It doesn't have to be this way. Many designers of algorithm-intensive systems already have the tools they need to get verification under control. By using these same tools for early verification with Model-Based Design, engineers can not only reduce verification time, but also improve the performance of their designs.
With early verification, the algorithm design and implementation teams use the same executable system model as their design reference and test bench. Automation interfaces link algorithm and system simulation tools with software and hardware development tools, enabling each team to reuse the test bench with minimal disruption. The result is a faster, less error-prone verification process that leverages each team's existing tools and workflow.
The cross-domain verification problem can be solved by pushing verification up to a higher level in the design flow. Tools for Model-Based Design provide multidomain modeling capabilities that enable "virtual integration" by simulating algorithms, software, digital hardware and analog together in one environment. This aspect of early verification lets system architects and component developers see how design decisions affect system-level behavior. It also helps designers catch integration problems early, while they are still easy to correct.
Using Model-Based Design, algorithm developers can apply their same tools for algorithm development and system simulation to rapidly prototype their designs on hardware, without low-level programming. This early verification technique lets designers quickly prove the viability of new ideas and analyze performance under real-world conditions.
Leading communications, electronics, and semiconductor companies have used all of these early verification techniques to gain competitive advantages by simultaneously reducing their test and verification costs while strengthening their ability to develop innovative new products faster.