Solutions For Faster Data Speeds Become Major Focus For Data Centers, Networks

Let's talk about latency. Let's talk about IoT. Let's talk about all the data flowing crazily in between. Because the time it takes for machines to receive instructions over the internet is going to become increasingly critical going forward.

The data center industry is fighting to shorten the time it takes for data to reach your device.

Latency is the big data term for the amount of time it takes a piece of information to travel from a data center to a user's device. It is measured not in seconds, but in fractions of a fraction of a second, with pushes to reduce it to nearly nothing.
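To make that scale concrete, here is a minimal Python sketch (the function name and the default port are illustrative choices, not anything from the panel) that approximates round-trip latency by timing a TCP handshake to a server:

```python
import socket
import time

def measure_latency_ms(host: str, port: int = 443) -> float:
    """Approximate round-trip network latency by timing a TCP handshake."""
    start = time.perf_counter()
    # Opening the connection requires one round trip to the server,
    # so the elapsed time is a rough proxy for round-trip latency.
    with socket.create_connection((host, port), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000.0
```

A result in the single digits of milliseconds is what the panelists describe as the current target for nearby data centers; tens of milliseconds suggests the data is traveling much farther.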

In today's world, latency is just one factor in building a data center, alongside computing power, cooling technologies and capacity. But as the world increasingly becomes connected, not just via smartphones and computers, but with cars and appliances and even medical devices, the speed of data becomes ever more critical, said Jacob Smith, the co-founder of the cloud computing company Packet.

“Putting your database in the cloud, that is one thing,” Smith said. "Putting autonomous vehicles [on the road] or rockets to the moon, that's another."

Transwestern Executive Managing Director Clark Dean moderates a DICE 2019 panel with CyrusOne Vice President Lindsey Bruner, City of Atlanta Chief Technology Officer Tye Hayes, INAP Senior Vice President Mary Jane Horne, DC Blox CEO Jeff Uphues and Flexential Regional Vice President Tim Langan

Even today, latency matters, although that need for speed depends on what the user is doing with it. Delivery of data at less than five milliseconds could be overkill for, say, a phone app that looks up recipes. But for trading stocks, a delay by even a millisecond could translate into lost money, INAP Senior Vice President Mary Jane Horne said. Horne was part of a lineup of industry experts during Bisnow's Data Center Investment Conference and Expo in Atlanta last week.

Latency also matters in more consumer-driven industries like gaming. Voice data is sent separately from the graphics a player sees on screen. If there's a delay between a player barking instructions and teammates in, say, Fortnite hearing them, that would be intolerable, Horne said.

“So you're already dead before the guy told you to turn around. You can't have that,” she said.

Latency is the difference between when a person expects something to happen and when it actually does, Exit Technologies founder Jeff Bittner wrote in an email.

"When we click a URL in our web browser, we expect to see it instantly. It is that dreaded latency that prolongs the page to load,” he said. "One of the reasons young people prefer texting over email is the reduced latency in communicating. Texting is almost instant, while [the] ... delivery method of email can equate to long wait times.”

Latency is perceived differently depending on the use case, and it is affected by a multitude of factors: the physical distance data travels, the number of routers it has to bounce through, the processing power of the device itself, the capacity of the fiber. Each factor adds to the total delay.
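The distance factor sets a hard floor: light in optical fiber travels at roughly two-thirds its vacuum speed, about 200 km per millisecond, so no amount of hardware removes the propagation delay. A back-of-the-envelope sketch (the half-millisecond per-router figure is an illustrative assumption, not a measured value):

```python
FIBER_KM_PER_MS = 200.0  # light in fiber covers ~200 km per ms (~2/3 of c)

def one_way_latency_ms(distance_km: float, router_hops: int,
                       per_hop_ms: float = 0.5) -> float:
    """Estimate one-way latency as propagation delay plus per-router processing."""
    propagation = distance_km / FIBER_KM_PER_MS  # distance term
    processing = router_hops * per_hop_ms        # each hop adds handling time
    return propagation + processing

# Ashburn, VA to Atlanta is roughly 870 km in a straight line; real fiber
# routes are longer. Even this optimistic estimate with six router hops
# gives about 7.4 ms one way, before any queuing or device delays.
```

That arithmetic is why, as the panelists note below, a distant hub can serve ordinary web traffic but falls short once applications demand single-digit or sub-millisecond latency.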

In the not-too-distant future, more devices will demand faster delivery, especially as artificial intelligence and augmented reality work their way into everything from transportation to surgery.

Turner Construction Vice President Pete Kangas moderates a panel at DICE 2019 with EdgeConneX Vice President Greg Carender, QTS Vice President Alan French, Ascent Data Centers Vice President Brad Pauley, Travelport Data Center Manager Tim Cooper and Tindall Corp. CEO Greg Force

“If it's an IoT device for smart homes, consumers usually won't notice a few milliseconds of delay and won't think too much about it,” cybersecurity expert Jamie Cambell wrote in an email. “[But] some industries like medical or transportation require fast response times because even one millisecond could mean the difference between life and death.”

To meet those demands, many data center providers are beginning to borrow a mindset from the physical logistics industry. As online retailers like Amazon promise faster and faster delivery of clothes and groceries to a shopper's home, they need distribution nodes closer to big population centers. The same is now happening with data centers, EdgeConneX Vice President Greg Carender said.

About a decade ago, 100 milliseconds of latency on average was common in Atlanta, according to Carender.

“Now it's less than five,” he said. “If we start getting sub-one-millisecond latency … you're going to need to be in downtown.”

Right now, much of the East Coast is served by Ashburn, Virginia's vast array of data centers. That is adequate for most apps, software and cloud users today. But data bound for Atlanta has to pass through a chain of data centers and routers along the way, and as lower latency is required, distant clusters like Ashburn's may not be the solution in the future, Carender said.

“What is it going to be 10 years from now? We can't tell you what it's going to be, but we know it's moving down,” he said.

Latency is why 5G networks are being championed as the next frontier, especially in an IoT world. Today, average latency in major cities is between 15 and 30 milliseconds, Bittner wrote. On the 4G network, latency can be as long as 70 milliseconds — dinosaur travel in today's world. With 5G, that maximum is only eight milliseconds, with a goal of hitting one millisecond, he wrote.

“If you think about the large hyperscalers out there … when you think of them, they're all creating applications today or creating their platform and connecting them to enterprise solutions that have to be one millisecond round trip,” DC Blox CEO Jeff Uphues said. “I have no idea what they're going to be doing five years from now.”

This quest for lower latency is already exacerbating the divide between the haves and have-nots of internet access in urban versus rural America. Data center operators want to be in areas where connectivity is already robust, and that is most often in or near major metropolitan centers.

“The network matters,” Uphues said. "You're not going to build a data center in a place that doesn't have a full, robust internet connectivity."

This gets to what is called the edge in data center parlance: having equipment and nodes that can provide better speeds closer to users. The more people gather in a place, the more data center providers will attempt to get closer to them, Horne said. The farther out from major metro areas, the worse the network may perform in delivering information, which adds latency.

“You're talking about the edge. But your edge depending on your industry is only as good as your downstream network,” Horne said. "And that downstream network could be a much higher latency. You've got to get closer to where that particular customer has to process that need."

Today, that means a gamer in rural America may be at a disadvantage to a friend who is playing online with him from Atlanta.

“He's going to always be slow, he's going to hear directions too late,” Horne said. "He's not going to be a productive member of the team. It's sad, but that's the way it goes."