April 29, 2011

In Combining Dynamic Testing and Static Verification – Part 2, we discussed the differences in static analysis techniques. In Part 3, we discuss the synergies between static verification and dynamic testing and the VectorCAST integration with Polyspace. There are some areas of overlap. For instance, dynamic testing can still uncover a runtime error, but only if a test case supplies the exact combination of input values that triggers it. This is useful when testing the robustness of a particular function by driving it with the maximum or minimum values of its input types. Static verification, however, would be able to accelerate the discovery of such bugs by... Continue reading »

Posted by Michael Rielly in Coding Standards

April 27, 2011

In Combining Dynamic Testing and Static Verification - Part 1, we discussed the differences between dynamic testing and static analysis. In Part 2, we discuss the differences in static analysis techniques and how dynamic testing and static verification can be combined. What is static verification? Among the different static analysis techniques, static verification stands out for its ability to exhaustively detect runtime errors such as the ones mentioned previously. It accomplishes this by running different sets of algorithms to determine the realistic range of values that specific variables can take at different points in the program. These... Continue reading »

Posted by Michael Rielly in Coding Standards

April 25, 2011

This week's three-part blog post will outline how to leverage the benefits of dynamic testing and static verification. It presents the salient features of both software testing techniques and how they are best implemented in automated tools. It will explain how these two techniques complement each other and how to leverage the synergies between them. Lastly, we present the VectorCAST integration with MathWorks® Polyspace®. Dynamic Testing and Static Analysis The differences between dynamic testing and static analysis at large can be confusing. This is to be expected; after all, both techniques (not to mention tool vendors!) promise similar benefits.... Continue reading »

Posted by Michael Rielly in Coding Standards

April 20, 2011

I took my traveling User Group meeting to another long-time customer in Connecticut last week. There were 11 people in attendance, many of whom have been using VectorCAST for over 10 years! It is always enjoyable to talk to happy customers and get their feedback on what they would like to see improved in the product. Some of the users have already upgraded to our 5.2 release and had positive things to say about the improvements we have made to the GUI. If you haven’t seen a demo of the new 5.2 features, check out the VectorCAST 5.2 Overview video. If your company is interested in scheduling a mini VectorCAST User Group at your location, please let us know!

Posted by Michael Rielly in VectorCAST In Action

April 18, 2011

When developing complex embedded software, such as avionics, medical device, and military software, analysis and testing are crucial for ensuring the safety and reliability of the final product. Two processes are often used to detect errors and ensure they are repaired: static analysis and dynamic testing. Some view the two methods of detecting defects as competitors. In a recent EETimes report, however, engineering expert Paul Anderson claimed the two processes complement one another. According to Anderson, static analysis can be effectively used for "bug hunting". Good static analysis tools are "simultaneously... Continue reading »

Posted by Michael Rielly in Avionics, Medical Devices, Military, News

April 15, 2011

As the marketing guy here at Vector Software, I spend a lot of time communicating with customers. I work with them on press releases, case studies, and testimonials. Unlike some other companies, our office doesn't have hand-crafted cherry wood desks, Herman Miller executive chairs, or expensive wall art. No, the walls here at Vector Software World Headquarters are decorated with customer testimonials. Each time we get a testimonial from one of our customers, we hang it on the wall. We usually hang up one or two a month. Today, however, we noticed that we may soon be running out of wall space. Here are the ones to go up this week from MDS, Endress+Hauser, Fujitsu, and Boston Scientific.

Posted by Michael Rielly in Journal Entry
