The Evolution of Source Code Analysis – Part 2: The Early 21st Century

on May 26, 2011 • by Todd Landry • with 3 Comments

In my last post, I took us back in time to an era of bad fashion, questionable music, legendary television shows, and source code analysis tools that were made specifically for software developers. It was the 1970s. In this post, I fast forward to just after the turn of the century to discuss the next evolution of static analysis tools.

The Early 21st Century

Not long after we first viewed hairy-footed Hobbits on the silver screen, and the sham that was affectionately known as Y2K, a new generation of source code analysis tools emerged to address the shortcomings of the first-generation tools.

These new tools looked beyond the syntactic analysis of previous tools and instead provided inter-procedural and data-flow analysis. Low-hanging fruit was definitely not the target for these tools.

These new techniques were serious: they found complex defects that could impact code quality and security, and they did so while ensuring that the “noise” (i.e. false positive rate) was greatly reduced compared to the first-generation tools. In addition to local defects, they were now identifying resource management issues, security vulnerabilities, concurrency issues, and so on. These were serious defects that, if left undetected and unfixed, had the potential to cause massive problems in the code stream.
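To make that concrete, here is a small, hypothetical C fragment (not taken from the original post) containing two of the defect classes mentioned above: a possible NULL dereference and a resource leak. Neither function looks suspicious on its own; spotting the problems means following data flow across the call from count_lines() into open_log(), which is exactly the kind of inter-procedural analysis these second-generation tools introduced.

    #include <stdio.h>

    /* May return NULL when the file cannot be opened. */
    static FILE *open_log(const char *path)
    {
        return fopen(path, "r");
    }

    static int count_lines(const char *path)
    {
        FILE *f = open_log(path);       /* possible NULL flows in from open_log() */
        int lines = 0;
        int c;

        while ((c = fgetc(f)) != EOF)   /* NULL dereference if the open failed    */
            if (c == '\n')
                lines++;

        return lines;                   /* resource leak: f is never fclose()d    */
    }

    int main(void)
    {
        printf("%d\n", count_lines("/tmp/example.log"));
        return 0;
    }

A first-generation, syntax-level checker sees nothing wrong with either function in isolation; tracing the NULL returned by fopen() into the fgetc() loop, and noticing that the FILE handle escapes without ever being closed, is precisely the deeper work these new engines took on.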

In order to perform this much deeper analysis, a fundamental change in the analysis techniques had to occur. These engines needed an unfiltered view of the entire code stream, and so they became tightly integrated with the integration build process.

Umm, Houston, we have a problem. If the analysis takes place at integration build time, that means the analysis is no longer being initiated by the developers. Source code analysis became centralized and moved further downstream, often as part of a code audit function.

Developers were now being told they had created bugs well after they actually checked in the code. They had already moved on to something entirely different, so bringing them day-old or week-old defects was certainly not the most productive use of their time. It is well documented that the earlier you find defects in your code, the more cost-effective it is to fix them, so you can clearly see the problems with these second-generation tools.

If only there was a way to bring these second-generation analysis capabilities to the developer desktop. More about that in my next entry.

3 Responses to The Evolution of Source Code Analysis – Part 2: The Early 21st Century

  1. Rod Chapman says:

    Todd,
    While I can agree with you that MALPAS and SPADE are certainly “first generation” tools, I can’t
    agree regarding SPARK and QAC.

    While these products have been around a long time, it certainly doesn’t mean they’ve
    stood still. QAC has come on in leaps and bounds recently…but I will leave it to someone
    from PRQA to offer more insight on that topic.

    I think you are also mis-categorizing SPARK, which has _always_ done very deep and efficient
    analysis, but has used contracts on package and subprogram signatures to avoid the nasty problem
    of having to do inter-procedural analysis. This is not a fluke – we designed it that way!

    – Rod Chapman, SPARK Team, Altran Praxis

  2. Todd says:

    Thanks for the comment Rod. Please don’t take my omission of the ’80s and ’90s as anything more than me actually trying to forget the ’80s and ’90s…not the best of decades for so many reasons ;-)
    Anyways, I believe those tools still fall into the first-generation category, namely because of the type of checking they do (AST-based). They lack the deep inter-procedural analysis that you get from second-generation tools.

  3. Rod Chapman says:

    Todd,
    By way of a short comment on this article – you seem to have missed out
    most of the 1980s and 1990s! There certainly was lots of research and use of static
    analysis in that time, particularly in the safety-critical market over here in Europe.
    What about the SPADE, MALPAS and (subsequently) the SPARK languages
    and toolsets? The latter has been in use on The EuroFighter Typhoon since 1990
    and is still going strong. What about PRQA’s QAC? They’ve been in business
    since at least 1990 as well. Come to mention it – there’s MISRA C itself,
    which dates from about 2002 or thereabouts and gave rise to a whole
    generation of tools. I’m sure I could think of many more examples…
    – Rod Chapman, SPARK Team, Praxis
