ITPV takes a look at the changing data center environment, where the arrival of open-standards-based data centers could free data center operators and businesses from vendor lock-in.
In some sense, data centers are the last bastion of proprietary, vendor-locked technologies in enterprise IT. Concepts such as agile IT, elasticity and openness turned from buzzwords into de facto practice on the applications front over the last decade. Now these concepts are being taken to the hardware side, right down to the storage, networking and compute boxes, with the coming of open data centers.
Taking agility to the data center
With the mainstreaming of the now ubiquitous cloud, businesses saw the benefits of agility and pay-as-you-go. These were delivered by the cloud providers, who in turn benefited from economies of scale. However, the underlying hardware in the data centers, being proprietary and offering little agility or interoperability, had not changed much. For example, an application developed for a storage box from vendor A would not work on vendor B’s, at least not without significant modification. This led to inefficiencies in deployment and maintenance, and an increase in the overall cost of ownership of the data center. It is an irony that the very hardware that enables software to be agile was itself rigid.
While vendor lock-in served the commercial interests of specific vendors, it was plainly obvious to data center owners – businesses and cloud providers alike – that proprietary, vendor-locked hardware was suboptimal in many ways. There was the obvious impediment to growth and agility, and with it came high costs. Then there was the need to employ specialists for each technology.
The changing applications scenario made the conventional data center even less appealing. Applications moved from record keeping and minimal processing (such as finance, HR and inventory applications) to more compute-heavy BI and analytics. Today’s businesses are looking at near-real-time analytics to analyze and predict customer behavior. This new application landscape calls for quicker provisioning of data center resources, making virtualization and XaaS imperative.
It was only a matter of time before people started thinking, “what if we software-define the whole thing?” After all, servers are already defined by software through virtualization, so it is a logical step to software-define the data center itself – the storage, compute and networking blocks – using commodity hardware.
What’s an open data center?
Traditionally, data centers were designed to accommodate technologies from multiple vendors. But these vendor-specific products were usually not interoperable.
Open data centers, in contrast, are built and operated on open standards. These standards are developed and maintained by industry bodies or consortiums of stakeholders. One such consortium, the Open Data Center Alliance (ODCA), widely recognized as a standards setter, explains on its website: “Open standards help ensure interoperability. Think of them as you would blueprints or specifications. These standards are not owned by any single entity, but rather groups of stakeholders. Like construction blueprints, open standards can be influenced, changed, and updated by stakeholders working together to reach agreement.”
Interoperability can be achieved by, say, making open APIs available. This allows anybody to write applications for that piece of hardware, or to run the same software on hardware from two different manufacturers. It is important to note that open standards are not necessarily the same as open source – the underlying design and firmware of the devices need not be open source. What matters is that the data center components are based on a standard that explicitly allows interoperability.
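To illustrate the point, here is a minimal sketch in Python. The vendor names and the interface itself are hypothetical, standing in for whatever API an open standard would actually define: the same management code provisions storage on boxes from two manufacturers, because both implement the agreed interface.

```python
from abc import ABC, abstractmethod

# Hypothetical open-standard interface: any vendor's storage box
# that implements it can be driven by the same management software.
class StorageAPI(ABC):
    @abstractmethod
    def create_volume(self, name: str, size_gb: int) -> str:
        """Provision a volume and return its identifier."""

class VendorABox(StorageAPI):
    # Vendor A's implementation details stay proprietary...
    def create_volume(self, name: str, size_gb: int) -> str:
        return f"vendorA://{name}/{size_gb}GB"

class VendorBBox(StorageAPI):
    # ...as do vendor B's; only the interface is shared.
    def create_volume(self, name: str, size_gb: int) -> str:
        return f"vendorB://{name}/{size_gb}GB"

def provision(device: StorageAPI) -> str:
    # Application code is written once, against the open standard,
    # not against either vendor's proprietary interface.
    return device.create_volume("logs", 100)

for box in (VendorABox(), VendorBBox()):
    print(provision(box))
```

The point of the sketch is the shape, not the code: as long as the standard is honored, the `provision` layer never changes when hardware is swapped, which is exactly the freedom from lock-in that open data centers promise.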
Data Center Trends 2017 – Open is here to stay
All predictions for data center and IT infrastructure trends in 2017 include the open data center as a major trend to watch. The seeds of what we today call the open data center were sown in 2010-11. Not just industry consortiums but also large individual Internet companies that badly needed more optimized data centers worked on standards and published them. Notably, Facebook, with its Open Compute Project, and Google, with Kubernetes, independently worked towards open data center standards. Then there is the Open Data Center Alliance, a consortium that has been actively promoting open standards for the data center and the cloud.
What’s more, the Chinese Internet companies Alibaba and Baidu, along with others, have formed their own open standards project for data centers, called Project Scorpio.
These independent and joint efforts towards establishing open data centers are an indication that open standards are set to become the de facto norm.
Open source has become a key driver in the design, implementation and management of new-generation data centers. From Facebook opening the designs of its server infrastructure to Google open-sourcing its Kubernetes container orchestration platform, there are several examples of how open source designs and technology are driving data center architectures today.
Neeraj Bhatia, Director – Partner, Alliances & Commercial Sales, Red Hat, says, “The use of open source to build and promote data centers is an increasing trend amongst organizations across the globe. As per a study by Statista, open source revenue is likely to grow to 57.3bn USD. More organizations today are deploying open solutions for data centers, to increase interoperability and ensure smooth, agile and efficient service delivery. With the increased use of cloud-based infrastructure, there is even more demand for standardized applications and operating environments which are open source in nature.”
Open data center benefits
The many benefits of an open data center over a conventional closed, vendor-locked one accrue from standardization. Multiple technologies running on disparate, non-redeployable hardware had hindered scaling up. With standardization, data center owners now have agility not seen before in the history of IT.
The contribution of the entire ecosystem of open source developers cannot be overstated. In the words of Neeraj, “With millions of developers contributing to bringing innovation and speed to operating environments, applications as well as the underlying infrastructure, open source also fosters collaboration within the community to help resolve issues and problems with speed. This increased innovation and speed helps drive simplicity of management within the data center with open tools that communicate on open standards that prevent a data center from being locked into any single vendor’s environment.”
“Businesses, for the longest time, had no choice but to run on vendor-locked technologies, as their application of choice required a particular platform. With the new cloud based approach for the creation and deployment of applications, that restriction no longer exists. This cost saving has fueled the revamp of their existing data centers to enable larger support of open, standardized infrastructure that is capable of supporting an agile, horizontally scaled application infrastructure that can now provide very high levels of availability without the huge costs associated with vendor-locked systems,” adds Neeraj.
Standardization, lower cost of support, and increased choice all lead to lower cap-ex and op-ex with respect to data centers, thereby lowering the cost of IT.
With open data centers, especially in today’s cloud-oriented world, organizations are now thinking in terms of standardized workloads running on open technologies. This has led to standardization of the underlying infrastructure and of the software required to manage it, and the resulting efficiency enables the data center to support newer services for the business.
When asked to comment on which verticals are likely to benefit most with open data centers, Neeraj remarks, “An open environment benefits everybody! With more collaboration, reduced cost of management and faster innovation, open source is the way to go. Every single new-age company that is redefining new ways of doing business is using open source solutions to bring innovation to market faster.”
He goes on to elaborate that the sectors driving the growth of open data centers are BFSI, telecom, IT, social media and e-governance initiatives. “An open data center will free every single vertical from locked systems and high costs of deployment and management,” he adds.
Challenges for adoption
One of the biggest challenges in adoption of new technologies or frameworks has always been that of skilled employees. In the words of George Chacko, Principal Systems Engineer & Lead Technical Consultant, Brocade India, “Business must have IT to survive in a digitally enabled world. It is no surprise that the demand for IT talent is higher than ever. An apparent lack of IT talent is impeding the success of organizations that are currently expanding or wish to expand their onshore talent, or are new entrants to the domestic, onshore market. As a result of the IT shortage, organizations who are not using creative methods to attract and retain IT resources will be viewed as less competitive and their market growth will be inhibited.”
According to Red Hat, the primary challenge when it comes to open data centers is security. “It is important for organizations to fully develop an understanding of the data life cycle and measures that can provide them with proper data protection. While implementing an open data center, it is also important to have properly trained and skilled personnel handling the deployment. We have a Red Hat® Certified Architect (RHCA) with a data center concentration for providing employee skills with tasks common in an on-premise data center,” says Neeraj.
Data center market in India
According to a recent study by IAMAI, the data center market in India has seen good growth over the last few years, driven by the explosion of data from smartphones, social networking sites, e-commerce companies and government-initiated projects. Modernized data centers have brought increased productivity, fewer outages and a more agile infrastructure, enabling them to support newer services that help business users drive faster growth for the organization.
Neeraj says, “The data center market in India is poised to see exponential growth. As more organizations need to support such a fast-moving environment, the design considerations of openness, agility, efficiency and low cost for their IT infrastructure become imperative. We will see different sectors coming to the fore to reap the benefits of open data centers in India.”
New data centers don’t carry legacy vendor lock-in or questions about the RoI of investments already made in conventional infrastructure, and are therefore best placed to implement open data center concepts.
Opportunities for channel partners
With cloud services becoming ever more common, the IT channel community should view themselves as, and act as, providers of services. A channel player should identify its strengths across horizontals or in an industry vertical, and re-skill to package and deliver solutions tailored to each customer. This is a mindset and operational shift from the conventional reseller model, which is quickly becoming obsolete. Now, with the coming of open data centers, businesses will see a reduction in complexity around hardware, and the focus will increasingly shift to services. It is in this context that channel players will have to find niches that suit them and stay relevant in a rapidly changing IT marketplace.
By Kailas Shastry