Probably few in the mainstream security community noticed the announcement of the first round of controllers certified by Wurldtech. Yes, follow these links before you read any further, because none of this will make sense if you don't.
Although it comes out of the SCADA security world, this is of personal interest to me, not only because Eric Byres (who started but did not finish this effort, although you won't see his name anywhere) and I discussed it quite a lot over the years (and it's something I presented on last year at PCSF and earlier at CanSecWest/01), but because the problem space was one I was intimately involved with at Cisco.
Over the years, I was involved in 3 or 4 different initiatives of varying scope and ultimate effectiveness that touched on the problem of defining minimal security standards/criteria/test procedures for network devices and protocols. See, I told you if you didn't follow the links you'd be lost or bored.
While a lot of folks find this sort of "methodology work" tedious or tiresome, I have always found it fascinating. Although, truth be told, much of the point of defining (and automating the testing of) minimum product security standards was to offload soon-to-be boring work so that others could do it. Running port scans, testing TCP/IP/UDP/ICMP stacks for protocol implementation flaws (i.e. running ISIC for days and days, even if it gets you a CNN moment), checking for weak or default credentials, running commercial test suites like Codenomicon, tracking down all the versions of software and searching public vulnerability databases for relevant flaws in Open Source or commercial software components--all of this should be done. I'd just rather someone else did it. And if I put my small company research hat back on, repetitive tasks like these aren't terribly interesting from a project standpoint either.
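To make the "define it once, let someone else run it" idea concrete, here's a minimal sketch of how a couple of those low-bar checks could be captured as automated test cases. Everything here is hypothetical (the `DeviceProfile` structure, the allowed-ports policy, the credential list); it's an illustration of the shape of such a suite, not any vendor's actual test harness.

```python
# A toy minimum-security-bar test suite. All names and policies here are
# illustrative assumptions, not a real certification procedure.
from dataclasses import dataclass


@dataclass
class DeviceProfile:
    """Stand-in for data already gathered from a device under test."""
    open_ports: set            # TCP ports found listening
    credentials: list          # (username, password) pairs that were accepted
    component_versions: dict   # software component -> version string


# Assumed policy: management traffic only over SSH/HTTPS.
ALLOWED_PORTS = {22, 443}

# A few well-known factory defaults to reject.
DEFAULT_CREDS = {("admin", "admin"), ("root", ""), ("admin", "password")}


def check_open_ports(profile):
    """Fail if any listening port falls outside the allowed set."""
    unexpected = profile.open_ports - ALLOWED_PORTS
    return ("open-ports", not unexpected, sorted(unexpected))


def check_default_credentials(profile):
    """Fail if the device accepted any well-known default credential."""
    weak = [c for c in profile.credentials if c in DEFAULT_CREDS]
    return ("default-creds", not weak, weak)


def run_suite(profile):
    """Run every check; each returns (name, passed, evidence)."""
    checks = [check_open_ports, check_default_credentials]
    return [check(profile) for check in checks]


if __name__ == "__main__":
    device = DeviceProfile(
        open_ports={22, 23, 443},          # telnet (23) should trip the bar
        credentials=[("admin", "admin")],  # a default credential was accepted
        component_versions={"openssl": "0.9.8"},
    )
    for name, passed, evidence in run_suite(device):
        print(f"{name}: {'PASS' if passed else 'FAIL'} {evidence}")
```

The point isn't the checks themselves (they're trivial); it's that once the criteria are written down as code, the boring part becomes a batch job anyone can run against the next product.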
However, defining and automating test cases or assessment criteria was interesting. I'm probably in the minority, though, since I've found "hard core" security folks to be the most resistant to the definition and enforcement of minimum security standards. You'll see this almost any time a new standard comes out: the standard is worthless because it doesn't go far enough; it doesn't have/do X, Y, or Z, so it will lure users/vendors/whoever into complacency.
I saw this firsthand during my last year at Cisco. There was division within our group on how to approach a new effort being proposed by another [competing??] product security group, who had in mind something very similar to Wurldtech's L1-4 test cases. The intent was to stamp out the "low hanging fruit" vulnerabilities which are unfortunately all too common. (You know, the sort of stuff behind many of the SCADA vulnerabilities being discovered.) The most senior engineer in the group was putting up the classic "false sense of security" argument. (Another unconscious argument often put forth is that finding vulnerabilities is more art than science and can't be captured in a process or methodology, but I don't think that was in play.)
I ultimately sided with the minimum standards effort for pragmatic as well as political reasons, since I knew the other group's manager (who I'd worked for previously) was far more savvy than our current boss. We'd have lost the fight even if it were the wrong thing to do--which it wasn't. Our group went on to drive the effort, and it was one of the more important things I worked on in STAT; the Nerd Lunch I did out in San Jose was received far better than the one I did on BGP.
So what are the takeaways? Sometimes the risk of setting the bar too high is greater than the risk of picking something that is merely good enough. The perfect is commonly the enemy of the good. There are always too many excuses not to do something now that solves part of the problem, in favor of endlessly working on something that might solve the entire problem in the future.
And I think this is one of the reasons why we see the first security certification of this type in the parochial world of SCADA and not somewhere else.