The U.S. Needs 4,000 More Data Centers In The Next Two Years

The big picture when it comes to data centers, according to the speakers at Bisnow's Data Center Investment Conference & Expo South, is very big indeed. 

Increasing connectivity and other factors mean that roughly 600 zettabytes of new data are created each year (that is 600 trillion gigabytes), about 200% more traffic than current data centers can handle. To keep pace with the growth of data, the industry will have to build about 4,000 new facilities — averaging 200K SF and 25 megawatts each — by 2020, the speakers said.
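Those figures can be sanity-checked with back-of-envelope arithmetic; the unit conversion and the derived aggregate-capacity total below are checks, not numbers from the speakers:

```python
# Back-of-envelope check of the figures quoted above.

ZB_IN_GB = 10**12  # 1 zettabyte = 10^21 bytes = 10^12 gigabytes

new_data_zb = 600
new_data_gb = new_data_zb * ZB_IN_GB
print(f"{new_data_gb:,} GB")  # 600 trillion GB, matching the article

# Implied aggregate buildout: 4,000 facilities at 25 MW each.
facilities = 4_000
mw_each = 25
print(f"{facilities * mw_each:,} MW of new capacity")  # 100,000 MW, i.e. 100 GW
```

The 100 gigawatt total is simply the product of the two quoted figures, offered here only to put the 4,000-facility number in perspective.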

The panel: University of Arizona Director Jim Jindrick; QTS Vice President, Innovation David McCall; Momenta Partners Strategy Partner Jim Fletcher; Microsoft Chief Technical Advisor Dave Crowley; Affinitas Life Chief Operating Officer and Chief Technology Officer Wayne Sadin; and Skybox Datacenters Vice President Gordon Kellerman, who moderated.

"What we're seeing is a transition, the beginning of an S curve of growth for the data center industry," Microsoft Chief Technical Advisor Dave Crowley said.

Crowley, who oversees network infrastructure (subsea cable, dark fiber, intelligent edge, 5G), content distribution networks and data center relations and design, said thinking about the data center business 10 or 15 years out is a challenge, but there are reasons to be optimistic.

"Unlike the internet bubble around 2000 — which involved ideas that had yet to generate any revenue — the cloud, with upward of $4 trillion in market cap, is a real business with real opportunities," Crowley said. "This isn't a bubble."

Even five years in the future is hard to visualize, said Jim Jindrick, director of corporate engagement in the University of Arizona's Office of Research, Discovery & Innovation.

"Our role as a university is to say, what else — how else — could we do things?" Jindrick said. "What new inventions could really turn the world on its edge?"

Affinitas Life Chief Operating Officer and Chief Technology Officer Wayne Sadin said he divides the world into the hyperscale cloud — Google, Amazon, Microsoft, IBM, Oracle — and everything else that gets called cloud.

"When I work with a hyperscale provider, it changes the way I can do business," Sadin said. "I can change how I innovate in a way that's hard for someone racking and stacking equipment."

Sadin offered an example of the monumental changes in computing just in recent years. "When I did an AI project about 10 years ago, before I did a test case, I had a quarter-million dollars' worth of hardware, a million-dollar contract with an AI company, and six weeks to get the material in, rack and stack it, tune it and configure it and run my first transaction.

"A few weeks ago, I wanted to run AI on Azure. We clicked a button, and it would have cost $4, except we got 30 days free. That's just one example of this new environment."

The cloud accounts for only 5% of the $3.7 trillion spent on IT worldwide, Sadin said. But not for long. "One day soon, 95% will be spent in the cloud. The question then becomes, what does that cloud look like?"

Connectivity is driving vast changes. "I live for the day when my microwave talks to my fridge, and they then call my car and tell it what to pick up on the way home," QTS Vice President, Innovation David McCall said.

Momenta Partners Strategy Partner Jim Fletcher, who recently served as chief technology officer for the IBM Watson Internet of Things Platform, said the edge isn't a physical thing, but a logical one. "When we look at it longer term, it might be 95% edge and 5% cloud," he said. 

Also, Fletcher said, IoT is about changing business models. "Companies can change, or they can become irrelevant," he said.

"IoT is allowing us to collect more data points than before at a greater frequency than before. AI needs to take that amount of data and turn it into actionable insight. We need predictive analytics that anticipate problems and take action ahead of time."