EXCLUSIVE INTERVIEW

CAST’s Marc Jones: For Fed’s Open Source, It’s Trust and Verify

CAST Software is a software analysis and measurement firm that uses an automated approach to capture and quantify the reliability, security, complexity and size of business applications. A main company objective is increasing software assurance around reliability and security of applications delivered to the U.S. government.

Marc Jones participates in a recent panel presentation at OSSI. (Photo: John Farrell)

Part of its drive for better software assurance relies on fact-based transparency into application development, sustainment and sourcing, which enables program management and acquisition leaders to drive down sustainment costs and risk.

Its client base includes Army PEO-EIS, the Air Force, SPAWAR, Military Health, and the Departments of Justice, Homeland Security, Veterans Affairs, and Health and Human Services, along with other defense, intelligence and civilian organizations.

As governments and businesses adopt open source software with increasing regularity, CAST applies the same analysis and measurement standards to open source and proprietary software alike, using its arsenal of commercial and open source tools. CAST is a commercial, for-profit software company with proprietary IP.

In this exclusive interview, LinuxInsider talks to Marc Jones, national federal practice director for CAST, about the process of certifying the fitness of proprietary, commercial and community-sponsored open source software.

LinuxInsider: How is what CAST provides to software users different from other software testing procedures?

Marc Jones: The firm is very focused on the requirements for software analysis and measurement. We work with a number of large entities in the private and public sectors, as well as the systems integrator community. We enable them to evaluate and monitor risk in a variety of ways in the delivery of their software applications.

As a software company, we have applications that look at source code and evaluate it for a variety of risks. For instance, we look at cost of ownership, maintenance and sustainment, and transition risk around vendor or employee changes. We look at short-term risks around security, performance and reliability.

LI: How much of this software assessment involves open source?

Jones: We also work on open source applications that clients use. Depending on whom you speak to in the open source world or IT community, open source has a different context. In our sector, we deal primarily with the IT organization fielding mission or business systems. I deal primarily with government IT, whether it is DoD (Department of Defense) or the civilian sector.

LI: How active are military and government in using open source?

Jones: When we talk to folks in IT, they are users of open source in the delivery of their own solutions in many cases. Their vendors deliver products to them that often have open source components within them: all the major Java frameworks, for instance, and a variety of the other things that are out there.

Over the last few years, we have seen government, from a functional and economic perspective, starting to seek open source business solutions. This includes CRM. And in health care organizations, we are starting to see interconnects that link various health care records and provider systems, and things of that nature. So the federal government has become very dependent on open source products.

LI: Are there different procedures involving how open source is treated compared to commercial or proprietary software?

Jones: What is interesting to me in that context is that the open source product is often not treated the same way custom commercial code is. Because it is open source, it is kind of dropped in, and in many cases it does not go through the same quality assurance and testing rigor that you might have with a software development process that is generating 100 percent new code.

LI: That view surprises me. I am always told by open source proponents that community code is more rigorously reviewed for quality and security. Is what you are saying disputing that claim?

Jones: When you talk about security with open source, it really comes down to controlling your supply chain. It does not matter what the platform may be: until that application is under your full control and that code is in the configuration management system of the entity that is using it, the software is not under any real control.

From our perspective, we are seeing a lot in the DoD assurance community that code is code, and everything needs to be verified. Just because that code happens to come from an open source website that has a community looking at it does not imply that by the time that code gets into your system it is reliable, resilient, etc. That open source code needs to be treated with the same rigor with which you are starting to see custom applications assessed.

Let me make another point. Often the fallacy is that these open source projects stand alone. But they don’t. They are integrated with other things. For instance, take that health care interchange solution I was describing. You do not even know how secure it is until you start linking it to all of the different data feeds and sources that would make up a deployed system. That is what would need to be verified ultimately.
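To make the supply-chain point concrete, here is a minimal, purely illustrative sketch, not a CAST product, of the kind of gate a consuming organization might put in front of its own configuration management system. The artifact names, versions and digests below are placeholders; the idea is simply that an open source component is admitted only after it matches a hash the organization itself has recorded and reviewed.

    import hashlib
    from pathlib import Path

    # Hypothetical allow-list kept in the organization's own configuration
    # management repository: artifact file name -> pinned SHA-256 digest.
    # Names, versions and digests are placeholders for illustration only.
    PINNED_HASHES = {
        "spring-core-5.3.30.jar": "placeholder-digest-recorded-by-the-organization",
        "hibernate-core-5.6.15.Final.jar": "placeholder-digest-recorded-by-the-organization",
    }

    def sha256_of(path: Path) -> str:
        """Compute the SHA-256 digest of a downloaded artifact."""
        digest = hashlib.sha256()
        with path.open("rb") as fh:
            for chunk in iter(lambda: fh.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def admit_artifact(path: Path) -> bool:
        """Admit an artifact into configuration management only if it is pinned and matches."""
        expected = PINNED_HASHES.get(path.name)
        if expected is None:
            print(f"REJECT {path.name}: not on the approved list")
            return False
        if sha256_of(path) != expected:
            print(f"REJECT {path.name}: checksum does not match the recorded value")
            return False
        print(f"ADMIT {path.name}")
        return True

    if __name__ == "__main__":
        admit_artifact(Path("downloads/spring-core-5.3.30.jar"))

The point is not the specific tooling; it is that the consuming organization, not the upstream community, holds the record of what has been approved.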

LI: So are you saying that it is only open source that is lacking in security verification?

Jones: I am saying that open source is not necessarily subjected to the same rigor, I think, that custom code being dropped into a solution from supplier X might be.

LI: Is the kind of situation you describe a detriment to open source, and does it cause businesses to shy away from using open source because it does not measure up to commercial products in testing?

Jones: I wouldn’t say that. I think there are positive aspects to open source. We did some work with a military health component where they actually ran our solution on open source products and fed those results back to the open source community, and saw a big improvement in the open source product. So the community is responsive to feedback that drives quality, I think, if it is an active community, of course.

LI: At what point can an open source user feel confident that the community backing a particular project has a quality product?

Jones: That being said, I think the larger point is that sometimes new developments create the perception of a panacea. The idea may be that because of the big community, somebody has already done that work for you. Ultimately, it is the business adopting that system that incurs the risk, not the open source development community.

That business has to manage the risk as if it were developing its own custom software. When you think in terms of someone buying a packaged application, that is a very big contract with lots of liability language in it, which puts some heat on the supplier from a risk-sharing perspective. You may not see that at the same level with a number of the open source products.

LI: When CAST performs code testing, does the process randomly check applications and issue reports, or is the testing the result of a client’s request?

Jones: We look at system-level risk. So from our customer’s perspective, just looking at an open source application and just issuing a report on it does not mean very much. Having an open source product sitting on the community’s server saying it has been validated does not really mean very much. They want to see how it looks after it has been adapted and perhaps modified or extended in their own environment with the other elements it is linking to.

So at CAST, one of our competencies is being able to run an analysis all the way from a data component to a mobile system, and everything in between. In that in-between layer, or at the mobile system, or at the database level, such as Postgres, we would look at all the code, not really identifying whether or not it was open source, but looking at the risk of the whole system.
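As an illustration only, and not a description of CAST’s actual analysis engine, a system-level view might roll findings from every component, open source or custom, into one picture, so that risk is judged on the integrated system rather than on any single project. The component names and scoring scheme below are assumptions made for the sketch.

    from dataclasses import dataclass

    @dataclass
    class Finding:
        component: str   # e.g. "mobile-client", "hibernate-core", "postgres-schema"
        category: str    # e.g. "security", "performance", "reliability"
        severity: int    # 1 (low) through 5 (critical)

    def system_risk(findings: list[Finding]) -> dict[str, int]:
        """Hypothetical roll-up: worst severity seen per category across all components."""
        rollup: dict[str, int] = {}
        for finding in findings:
            rollup[finding.category] = max(rollup.get(finding.category, 0), finding.severity)
        return rollup

    findings = [
        Finding("mobile-client", "security", 3),
        Finding("hibernate-core", "performance", 2),
        Finding("postgres-schema", "reliability", 4),
    ]
    print(system_risk(findings))  # {'security': 3, 'performance': 2, 'reliability': 4}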

LI: Given your view that the risk in the open source communities varies, what needs to be done to make open source products more compliant or equal to proprietary and commercial products?

Jones: Buyers should make no assumptions either way. They should apply their own due diligence, take responsibility for verifying what they are getting, and make sure that what they are putting into their environment is compliant with best practices. The only way you can assure that is to run the analyses yourself. You cannot rely on a third party to self-assess and just give you the code that you take sight unseen. It is a trust and verify aspect.
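A minimal sketch of that "run the analyses yourself" idea follows, assuming a deliberately tiny rule set and a hypothetical vendor directory; real analysis platforms apply far richer rules, but the principle is the same: the consumer scans the code it actually received rather than accepting an upstream report sight unseen.

    import re
    from pathlib import Path

    # Deliberately tiny, hypothetical rule set used only to illustrate the idea
    # of running your own checks on incoming open source code.
    RULES = {
        "use of eval()": re.compile(r"\beval\s*\("),
        "possible hard-coded password": re.compile(r"password\s*=\s*['\"]"),
    }

    def scan_tree(root: Path) -> list[tuple[str, str, int]]:
        """Return (file, rule, line number) for every rule match under root."""
        hits = []
        for path in root.rglob("*.py"):
            for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                for rule, pattern in RULES.items():
                    if pattern.search(line):
                        hits.append((str(path), rule, lineno))
        return hits

    if __name__ == "__main__":
        # "vendor/open_source_component" is a placeholder path for the code under review.
        for file, rule, lineno in scan_tree(Path("vendor/open_source_component")):
            print(f"{file}:{lineno}: {rule}")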

LI: With the DoD getting more involved with using open source, how effective is that becoming in lowering IT costs?

Jones: There are a lot of different categories of open source. There are many open source tools that the government can bring in to assist development. There are plug-ins for Java and .NET and things of that nature that are open source products the government can download and use to make their own code better when they are writing it.

They can use open source frameworks so they do not have to write their own — Hibernate, Spring, etc. These are all outstanding time savers. And the government leverages them like crazy. As you go up the chain of functionality, I think you see the government is consuming a ton of open source and is looking a lot more at leveraging the functional business applications in a big way.

LI: So just as a business enterprise would, how do government agencies test and verify open source software assurance?

Jones: We have been a big part of that process on behalf of government. They have used our system to review that source code. Code is code at the end of the day. So running an analysis on an open source product, or on open source integrated with custom code, is technically no different from anything else. I think it is with that kind of assurance that you are starting to see more confidence in leveraging open source to take advantage of some of the cost benefits there.

LI: Can you give some examples of how the government is using open source projects rather than more expensive proprietary or commercial software?

Jones: Look at NOAA (the National Oceanic and Atmospheric Administration), for instance, to see open source databases, Python and all kinds of other open source in use. They are hardly unique in that use. Even at CAST, we make use of open source to extend functionality.

Open source certainly is not going away. It is becoming more and more a part of the infrastructure. I think one takeaway is that ultimately, regardless of whether the open source code comes from a purely independent community or a federal integrator, open source should not get a free pass on verifying that it meets fundamental tests of mission or business worthiness. And conversely, the open source community should not feel threatened by that.

LI: So you do not see any shortcomings to open source in terms of security issues compared to commercial or proprietary alternatives?

Jones: If clients are feeding information back to the community to make that product better, more resilient and stronger, then all the stakeholders benefit from it. That transparency itself is very, very interesting. But it does not guarantee you reliable or secure software. That still needs to be validated and verified as those systems get integrated, customized, enhanced and configured in the specific environments in which they are going to be deployed. There is no shortcut.

Jack M. Germain has been writing about computer technology since the early days of the Apple II and the PC. He still has his original IBM PCjr and a few other legacy DOS and Windows boxes. He left shareware programs behind for the open source world of the Linux desktop. He runs several versions of Windows and Linux OSes and often cannot decide whether to grab his tablet, netbook or Android smartphone instead of using his desktop or laptop gear.
