The diversity of operating systems and hardware in the datacenter is greater than ever before, while software specifically designed to manage these different types of systems is becoming more widespread and popular.
With a firsthand view of the evolution of Linux and its growing role in today’s diverse datacenter, Veritas Software says the open-source platform has truly hit the mainstream enterprise, pointing to the fact that Linux now handles database workloads.
A strong proponent of utility computing, a relatively new computing strategy in which IT resources such as storage, servers and applications flow like water, Veritas says even the most conservative companies are now coming around to Linux and more open-source software in the datacenter.
In an exclusive interview, LinuxInsider talked with Veritas director of Linux strategy Ranajit Nevatia to hear about these changes.
LinuxInsider: What are you seeing in terms of the shift from homogeneous IT environments to heterogeneous ones with both Windows and Linux?
Ranajit Nevatia: Homogeneous IT environments have often been a myth, and now, with the success of Linux, heterogeneous environments are becoming even more visible. Enterprise users have always supported multiple platforms, whether by design or, often, because of inorganic growth through acquisitions and expansions. Nonetheless, we’re still seeing companies converge toward dual-platform strategies.
In the past, these dual-platform strategies were predominantly Unix-Windows. Today, however, more often than not, Linux is the common platform of choice, whether it is Windows-Linux or Unix-Linux. Overall cost, flexibility and ease of customization are the strong selling points of Linux in these mixed environments.
Our customers are definitely moving to Windows-Linux as well as Unix-Linux mixed environments. But mixed environments are nothing new in client-server computing. Platforms such as Windows NT, Macintosh, Novell NetWare, Linux and Unix, in tandem with Web-based self-service applications, e-mail and wireless devices like PDAs and cell phones, are all examples of how diverse operating system platforms, applications and technologies work together in a mixed environment.
Because Linux is built on open source, it is easier to integrate with than more proprietary platforms, which gives vendors and customers unprecedented control in Linux environments.
LinuxInsider: What are the biggest challenges associated with the diverse, multi-vendor environment in the datacenter these days?
Nevatia: The greatest attribute of a multi-vendor environment is also its greatest challenge: diversity. Diversity translates into greater options and choices for solving business and IT problems, but it also has the potential to decrease efficiency and flexibility. Diversity, or “heterogeneity,” if not handled properly, can also create significant management issues, all of which lead to a higher cost of ownership.
Diversity in a multi-vendor environment comes in many shapes and forms. From a platform-infrastructure perspective, it can mean various types of servers, disk arrays, tape libraries and so forth. From an application perspective, it can mean various types of Web servers, application servers and database servers. In all cases, the most common and most critical aspect of any IT environment continues to be data. Data is the most valuable asset of any organization.
Whether it’s the desktop, workgroup or datacenter scenario, customers are seeing their data demands increase exponentially. They’re grappling with how to manage that data, how to virtualize it so it supports more users and applications, how to store it, how to protect it in case disaster strikes, and how to ensure business continuity when disaster does strike.
Many Veritas customers use our full set of storage, data protection, high availability and performance management solutions precisely because Veritas provides a common set of policy-based management tools that they can apply consistently across any environment or OS platform.
LinuxInsider: You talk a lot about utility computing — please define it in your own words and explain whether its promise is being realized.
Nevatia: Utility computing is a computing model that delivers IT as a measurable service, aligned with business needs and capable of adapting to changing demands. Essentially, there are three functional building blocks that are necessary across storage, server and applications for IT to operate as a utility.
First, availability — data and applications must be “always on,” insulating the end user from a simple system failure right through to a complete site outage. Second, performance — the system must run with optimal speed and efficiency, from the application right down to the storage array. And third, automation — shared hardware resources must be efficiently managed, and the resources should adapt to changing business requirements without the need for administrator intervention.
The end goal is for customers to achieve a model of computing where applications are always available, performing at the desired rate and automated to reduce administrative costs, similar to the way a utility delivers water or electricity.
The biggest challenge for many aspiring IT utilities is how to build in the flexibility needed to accommodate multiple platforms and a variety of hardware devices from different vendors. Open, heterogeneous software can provide centralized visibility into these disparate resources, helping the IT utility bring the pieces together into a single holistic view and realize the full promise of utility computing.
LinuxInsider: How critical are Linux and other open-source software and standards to the idea of utility computing?
Nevatia: The vision of utility computing drives higher levels of virtualization in the datacenter environment. Only a truly virtualized IT infrastructure can deliver the kind of service that a water or electricity utility provides. Theoretically, in such a scenario, the underlying OS and hardware become irrelevant, making the platform a prime target for commoditization. That makes Linux critical, given that it is already driving toward the same goal.
Open systems and open-source standards will play a growing role in the evolution of utility computing. Doing more with less is what utility computing is all about: achieving greater economies of scale. Customers are working to drive down the cost of IT, optimize all their IT resources and increase the flexibility of their IT environments.
Linux is significant in a utility-computing model because it drives down or commoditizes the cost of both the hardware and OS, making the underlying infrastructure more cost-effective. So IT managers can now shift their focus from the infrastructure and OS layers to where the real intrinsic value lies, namely the software layer that allows them to simplify, optimize and expand the management of these commoditized environments.
Many Veritas customers are looking at ways they can use Linux in their enterprise environments to do more with less, which brings us back to the greater economies of scale that are inherent in a utility-computing model and that ultimately deliver higher levels of service.
LinuxInsider: Do you sense Linux and other open-source deployments moving beyond pilot projects and experiments in mainstream corporations to take on more critical tasks or roles?
Nevatia: Absolutely. Major Fortune 500 Veritas customers are deploying Linux to handle business-critical applications and databases. Back in the last century, Linux did start with pilot projects in certain key verticals like finance and telecom, and with certain niche applications in research, such as HPTC, or edge-server apps like DNS, proxy, firewall and Web servers. This century, however, it has had a much broader appeal.
Today, Linux is definitely on the mainstream path. Leading IT vendors like Veritas, Oracle, IBM, HP, Intel and Novell are delivering enterprise-ready solutions for Linux and are investing heavily in its future development. It’s where the market is growing the fastest, and it’s the direction our customers want us to go.
Support from these vendors is enabling customers to deploy a much wider set of commercial applications. They say database workloads are the final frontier for an OS to become mainstream in the enterprise. Well, guess what? This year is proving to be the year when many customers are moving their database workloads over to Linux.
LinuxInsider: How far along are we on the path to that mainstream adoption?
Nevatia: Linux is already mainstream in some industries and applications. It perhaps commands the largest number of servers when it comes to edge-tier deployments. It now spans multiple industries and applications and is pushing deeper into the enterprise.
Besides the financial and telecom sectors, we are now seeing strong traction across the board, in the healthcare, e-business, government and transportation sectors. Even insurance companies, which have traditionally been very conservative, are keenly evaluating the prospects of Linux. We all know some of these companies are so conservative that one could argue they missed the whole open-systems movement and still revolve around the mainframe world. Some customers now even want to run Linux on the mainframe.
LinuxInsider: Would you care to comment on some of the improvements in both the reality and perception of Linux security?
Nevatia: There are multiple aspects of security. We don’t claim to be the security experts — other companies have that expertise. That said, let’s reflect on two aspects of security: prevention and protection.
Prevention involves making sure your environment is not susceptible to external malicious agents taking advantage of intentional or unintentional holes. That requires a good understanding of, and visibility into, your environment. Linux is built on open-source technology, which provides the visibility necessary to guard against such vulnerabilities. No software is perfect, but being open source means shortcomings can be identified and fixed with relative ease.
Protection — more specifically, data protection — involves making data, the most valuable asset, as secure as possible. Veritas has built a business around providing the highest levels of functionality to secure customers’ data and ensure business continuity. So we take a holistic view, regardless of the platform — whether it is Windows, Unix or Linux.
If users have diligently backed up and archived, they can quickly recover critical business data, up to the instant just before disaster struck. If companies cluster and properly configure their enterprise applications and databases, those systems can then fail over automatically, in the case of a site outage, to the next available server, regardless of whether that server is next door or three states away.
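The automated failover Nevatia describes comes down to a health-check loop that promotes a standby node once the primary stops responding. The following Python sketch illustrates only the idea; the hostnames, service port and promotion command are hypothetical placeholders, not any Veritas product interface.

    # Minimal illustration of automated failover: poll the primary's service port
    # and promote the standby after several consecutive failed health checks.
    import socket
    import subprocess
    import time

    PRIMARY = "db-primary.example.com"   # hypothetical primary server
    STANDBY = "db-standby.example.com"   # hypothetical standby, possibly off-site
    SERVICE_PORT = 5432                  # hypothetical database listener port
    CHECK_INTERVAL = 10                  # seconds between health checks
    MAX_FAILURES = 3                     # consecutive failures before failing over

    def is_healthy(host, port, timeout=3.0):
        """Treat the node as healthy if its service port accepts a TCP connection."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def fail_over():
        """Promote the standby; 'promote-standby' stands in for a real cluster action."""
        subprocess.run(["ssh", STANDBY, "promote-standby"], check=True)

    def monitor():
        failures = 0
        while True:
            if is_healthy(PRIMARY, SERVICE_PORT):
                failures = 0
            else:
                failures += 1
                if failures >= MAX_FAILURES:
                    fail_over()
                    return
            time.sleep(CHECK_INTERVAL)

    if __name__ == "__main__":
        monitor()

In practice a cluster manager also fences the failed node and redirects clients, but this promote-on-repeated-failure loop is the core of the automation.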
LinuxInsider: What type of datacenter do you envision for the typical, security-minded, large corporation in the years ahead?
Nevatia: We envision a heterogeneous datacenter that functions more like an IT utility where data assets are distributed, yet highly consolidated, across multiple locations and tied together — virtualized — in a highly secure and efficient manner for optimum usability and overall system integrity.
Security management will become more centralized and automated to standardize and enforce security policies. Intrusion detection systems will become more heuristic in that the systems will intelligently seek out, monitor and automatically correct anomalies before they adversely impact the system.
Managing system security issues will be yet another capability enabled by provisioning. Datacenter administrators will be able to respond to security risks by creating or updating an image of a standard “good” server OS and automatically rolling it out to multiple servers. In fact, Veritas customers can accomplish on-the-fly asset provisioning today using our OpForce product.
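The rollout pattern described above, maintaining a known-good image and pushing it to many servers at once, can be sketched as a simple parallel loop. The Python snippet below is illustrative only; the deploy-image command, image path and host list are hypothetical placeholders and do not represent the OpForce interface.

    # Minimal illustration of re-imaging a fleet from a "golden" OS image in parallel.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    GOLDEN_IMAGE = "/images/server-golden.img"        # hypothetical known-good image
    INVENTORY = ["web01", "web02", "app01", "app02"]  # hypothetical target servers

    def provision(host):
        """Re-image one host; 'deploy-image' stands in for a real provisioning command."""
        result = subprocess.run(
            ["deploy-image", "--host", host, "--image", GOLDEN_IMAGE],
            capture_output=True,
        )
        return host, result.returncode == 0

    def roll_out():
        with ThreadPoolExecutor(max_workers=4) as pool:
            for host, ok in pool.map(provision, INVENTORY):
                print(host, "re-imaged" if ok else "FAILED")

    if __name__ == "__main__":
        roll_out()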