When the word “datacenter” comes up, images of endless arrays of servers housed in a gigantic building usually come to mind. Many of today’s datacenters have the size, personnel and power requirements to qualify as miniature cities in their own right. Even a typical on-site datacenter needs plenty of room and enough people to tend to it effectively.
Over the past few decades, this infrastructure was exactly what industry juggernauts like Google and Amazon needed to safely meet their unique demands. But what if it were possible to shrink it all down to a more manageable size? That’s the ultimate vision being put forth by some key players within the industry.
Taking the Datacenter in a New Direction
At the Open Compute Summit in San Jose, Vapor.io CEO Cole Crawford and Chief Architect Steven White unveiled their vision of a modern datacenter in which row after row of hardware servers give way to software-defined solutions under an open standard. The company’s Open Data Center Runtime Environment is one of the first of the Open Compute Foundation’s contributions to rely on the reciprocal license, allowing ODCRE’s open API to integrate with a multitude of legacy datacenter hardware.
In addition to open integration, ODCRE also turns the modern-day hardware rack system on its head with a unique circular stack. The new rack mount system gives standard equipment users a better way to consolidate their infrastructure and maximize space and power usage. The company hopes this will give enterprises in smaller urban spaces the opportunity to enjoy powerful datacenter technology in a smaller form factor.
With this latest technology, Vapor.io aims to attain a power usage effectiveness (PUE) ratio as low as 1.1 – a far cry from the current industry average of 1.9. This focus on power consumption isn’t surprising considering that datacenters consume between 1.7 and 2.2 percent of total electricity generated in the U.S., with much of that power dedicated to cooling.
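To make those ratios concrete, PUE is defined as total facility power divided by the power delivered to IT equipment, so a value of 1.0 would mean every watt goes to computing. The sketch below uses illustrative numbers, not measurements from any real facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the power usage effectiveness (PUE) ratio for a facility.

    PUE = total facility power / IT equipment power; lower is better,
    with 1.0 as the theoretical ideal.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,900 kW overall to power 1,000 kW of IT load sits
# at the industry-average PUE of 1.9; trimming overhead (mostly cooling)
# to 100 kW would hit the 1.1 target.
print(pue(1900, 1000))  # 1.9
print(pue(1100, 1000))  # 1.1
```

Seen this way, the gap between 1.9 and 1.1 is simply the difference between spending 900 kW versus 100 kW on everything that isn’t computation.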
Expanding Beyond the Digital Frontier
Smartphones and tablets have proven to be the big driver behind today’s ever-increasing global dataflow. Thanks to their proliferation across the globe, the total amount of data created and processed each year is projected to reach 40 zettabytes by 2020.
This leaves the centralized datacenter of today in a bit of a bind. It’s already a challenge for current datacenter technology to adequately provide edge service, and large datacenters saddled with power consumption problems may find themselves lacking the computing capacity to get the job done.
Building a more effective datacenter means finally tackling how cloud computing and its many software-driven technologies (such as virtualization and the “as-a-service” phenomenon) fit into the picture. It’s a conversation that’s long been in the making, and it needs to happen if the datacenter is to fit within tomorrow’s data narrative.