Archive for the ‘Static Analysis’ Category

  • The Evolution of Static Code Analysis – Part 1: The Early Years

    on May 17, 11 • by Todd Landry • with 2 Comments


    Our marketing people here at Klocwork like to see me racking up frequent flyer miles and emitting CO2 at roadshows, conferences and tradeshows. Whenever I’m out speaking, I like to gauge audience familiarity with Static Code Analysis. I’m happy to say that SCA knowledge has definitely increased over the years, but it is still not at the levels enjoyed by unit testing or integration testing. What I plan to do over the next three weeks is provide you with a history lesson on how Static Code Analysis has evolved over the past few

    Read More »
  • Static analysis cures all ills?

    on Mar 17, 11 • by Alen Zukich • with No Comments

    There was a recent article from Mark Pitchford titled: “Think static analysis cures all ills? Think again.” Obviously, being biased working here at Klocwork, I take major exception to what Mark has to say. This article makes ridiculous claims. About the only thing Mark got right was that static analysis has been around for a long time. However, it’s ludicrous to think that today’s tools are the same as they were in the past. That’s like saying computers from decades ago are the same as today. The advancement has been huge for static analysis tools, especially in the last couple

    Read More »
  • All static analysis tools are not created equal

    on Mar 8, 11 • by Brendan Harrison • with No Comments

    Yes, it’s true (!) and as anyone in this space knows there is a huge difference between static analysis tools, their level of sophistication, and their approach to developer adoption. Gary McGraw & John Steven from Cigital describe their views on this topic, including ‘5 pitfalls’ that customers should avoid when evaluating tools. These pitfalls mostly amount to the fact that analysis results can vary significantly across different tools, code bases, and tool operators, so be aware of this when conducting your benchmarking. Their overall recommendation: “The upshot? Use your own code instead

    Read More »
  • How smart companies roll out source code analysis tools

    on Aug 19, 10 • by Patti Murphy • with 1 Comment

    Want to get rolling with a Source Code Analysis (SCA) tool as efficiently as possible? “Do what the smart companies do,” says Mark Grice, a Klocwork Director and Manager of the International Reseller/Partner Network. In our last discussion, Grice outlined three best practices for SCA tools selection: involve your developers, limit your selection to market-leading tools, and identify a deadline. According to Grice, smart companies take those best practices and: Buy an introductory package and pick one development team that will deploy the SCA tool. Do an in-depth performance analysis after six months. Expand the

    Read More »
  • 0010 0000 or 0000 0010 which one are you?

    on Aug 10, 10 • by Eric Hollebone • with 4 Comments

    I love this quote by Carl Ek from Code Integrity solutions: There are 0010 0000 kinds of people in the world: Those that understand the difference between Big Endian and Little Endian, and those that do not. Issues with endianness and processor architecture ports are becoming more and more common these days as more desktop source code moves into different arenas. Gone are the days when the 32-bit memory model or little-endian format dominated. Software changes are required to support the growth occurring not at the desktop, but in the server and mobile platforms. Mobile devices especially have opened a

    Read More »
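    The byte-order difference behind the quote can be seen directly by serializing the same 32-bit value both ways. This is a minimal sketch using Python’s standard `struct` module; the value and variable names are illustrative:

```python
import struct
import sys

# The same 32-bit value laid out in both byte orders.
value = 0x01020304

big = struct.pack(">I", value)     # big-endian: most significant byte first
little = struct.pack("<I", value)  # little-endian: least significant byte first

print(big.hex())     # 01020304
print(little.hex())  # 04030201

# Code that assumes the host's native order breaks when ported between
# architectures; sys.byteorder reports which order this machine uses.
print(sys.byteorder)
```

    Reading raw bytes written on a little-endian desktop back on a big-endian server without an explicit byte order is exactly the kind of porting issue described here.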
  • Measure value out of static analysis

    on Aug 3, 10 • by Alen Zukich • with 1 Comment

    I’ve talked about different metrics that are used to measure quality and the metrics that developers would use in practice. But what about the tools themselves? How are you measuring the value you are getting out of these tools? In terms of static analysis, one obvious measurement is simply the bug fixes you have made. Most organizations have a number they use to define the cost savings for each bug. Research data from IBM puts fixing a bug before a release at 40-50 times cheaper than fixing it afterwards. Fixing a bug after

    Read More »
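    The cost-per-bug arithmetic can be made concrete with a back-of-the-envelope calculation. The bug count and dollar figure below are purely hypothetical; only the 40x multiplier comes from the cited IBM range:

```python
# Hypothetical inputs -- only the 40x factor comes from the cited 40-50x range.
bugs_fixed_pre_release = 200          # bugs caught by static analysis before release
cost_post_release = 5000.0            # assumed cost ($) to fix one escaped bug
savings_factor = 40                   # low end of the cited range

cost_pre_release = cost_post_release / savings_factor       # 125.0 per bug
savings = bugs_fixed_pre_release * (cost_post_release - cost_pre_release)
print(savings)  # 975000.0
```

    Even with the conservative end of the range, the per-bug difference dominates the total, which is why bug fixes made before release are the obvious measurement to start with.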
  • Leveraging static analysis

    on May 12, 10 • by Alen Zukich • with 1 Comment

    In a previous post I discussed the process where we practice dogfooding.  This is the process of using Klocwork on Klocwork (KonK).  We started this program several years back with the hopes that we would learn some valuable lessons about usability, performance and anything else that would give us an edge.  The truth is that KonK has consistently allowed us to test our design assumptions early by allowing our own developers to use Klocwork as part of their development. One of the unexpected results was inadvertently uncovering data that further validated for us the importance

    Read More »
  • Static analysis for Ruby/Python

    on Jun 29, 09 • by Denis Sidorov • with 13 Comments

    As a developer of a static analysis tool for mainstream statically-typed languages, like C++ and Java, I have wondered for quite a while how well static analysis applies to dynamically-typed languages, like Ruby and Python. And recently, I came across this interesting project on GitHub: Reek – Code smell detector for Ruby. Well, I suppose that’s just a fancy way to name a static analysis tool. What can Reek detect? It does not do heavyweight data/control flow analysis, so the list is not very exciting: Code Duplication – AFAIU, it’s not very accurate, ’cause

    Read More »
  • Parallel Lint

    on Jun 22, 09 • by Alen Zukich • with 2 Comments

    Interesting article on static analysis tools that help find concurrency issues. These so-called “Parallel Lint” tools are specific to finding these types of issues. Overall there are some great discussions on certain tools, and it is always nice when Klocwork gets mentioned. But my problem is with the categorization of these tools. It makes me feel sick every time someone puts Klocwork in the same category of “powerful static analysis” as JLint, C++Test, FXCop and my favorite, PC-Lint. This article goes deeper into PC-Lint and what they are doing with deadlocks. The author

    Read More »
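    A minimal sketch of the kind of issue a “parallel lint” checker looks for: two code paths that acquire the same pair of locks, where deadlock is avoided only if every path follows one agreed acquisition order. Function and lock names here are illustrative, not from any particular tool:

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

# Deadlock-prone pattern: one path takes A then B while another takes
# B then A, creating a potential circular wait. The conventional fix,
# shown here, is a single global lock order: always A before B.

def update_accounts():
    with lock_a:
        with lock_b:
            return "ok"

def update_accounts_reversed_fixed():
    # Logically the "B then A" path, rewritten to honour the A-before-B
    # order so the two paths can never deadlock each other.
    with lock_a:
        with lock_b:
            return "ok"

results = []
threads = [
    threading.Thread(target=lambda: results.append(update_accounts())),
    threading.Thread(target=lambda: results.append(update_accounts_reversed_fixed())),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # both paths completed without deadlocking
```

    Detecting inconsistent lock-acquisition orders across call paths is exactly the kind of inter-procedural reasoning that separates deep analyzers from pattern-matching lint tools.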
  • False positives in modern static analyzers

    on May 22, 09 • by Alen Zukich • with 1 Comment

    In response to Jason’s post about false positives. First of all, there is a general misconception about false positives. Modern static source code analysis tools have changed the game. These are not the Lint tools of the past; a focus on deep inter-procedural technology means static tools today are expected to produce more real issues than false reports. With that said, Jason is right: a large code base that has never run static analysis will produce a large number of issues no matter how accurate the tool is. Even though static analysis tools do provide a number of

    Read More »