Reaching for the Clouds: Enabling tomorrow’s rapid connections, today

Credit: Skypixel

IoT is generating massive amounts of data, and this growth is forecast to continue as more connected devices come online around the world. According to Gartner, the total number of IoT devices globally will reach 20.4 billion by 2020.

With the rise of next-generation connected devices, companies that make and sell them will be looking for cloud solutions to manage the data. In Australia, Gartner noted that spending on data centre systems will grow to AUD$2.7 billion in 2018.

With IoT and smart devices driving increased consumer demand for seamless experiences, data, and access to it, are increasingly critical to business success. User demand is set to increase exponentially. From the relatively simple task of using a taxi booking service on a smartphone to the high-intensity computation and storage capacity needed for artificial intelligence (AI) and facial recognition, consumers will expect more from their services and connectivity.

Data centre operators and facility managers are under pressure to ensure their infrastructure has the capacity to meet future demand and velocity requirements. Beyond increased data storage and exponential growth in user demand, there is growing recognition that businesses need to get as close to their customers as possible to minimise latency for bandwidth-intensive applications.

Effective data centre infrastructure planning allows operators to deliver rapid deployments for low- to high-density applications, offer usable density (as opposed to theoretical density), and build a highly scalable platform that simplifies migration to higher-speed technologies and applications.

Rapid deployments for low- to high-density applications

Data centres need to be optimised to support increases in network speed, capacity demand, and range of devices. To accommodate the higher capacity demands, future-ready infrastructure is moving towards flexible higher-density cabling solutions that will support seamless migration to 40G, 100G and 400G.

Credit: Supparsorn Wantarnagon

A data centre’s proximity to other data centres will also become a greater consideration. Multiple data centres located close together form an Availability Zone (AZ) over a wide-area network connection, where the actual computing, storage, networking and database resources are hosted. Each AZ is linked to at least one other AZ within the same geographical area through low-latency links. Although linked, each AZ is independently powered, which minimises the impact on other AZs should one fail. Because replicating services across multiple AZs helps decrease latency and protect resources, using at least two AZs in a region supports greater stability and availability of infrastructure.

Seeking assurance of a higher level of business continuity in the event of an outage or natural disaster, operators will increasingly opt for infrastructure solutions that utilise multiple AZs. A facility’s proximity to other data centres reduces latency between connections.
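The multi-AZ principle described above can be sketched in a few lines of code. This is a hedged illustration only: the zone names and the `place_replicas` helper are hypothetical, not any particular cloud provider's API. The point it shows is that spreading replicas across at least two zones means the loss of one zone never takes every copy offline.

```python
# Minimal sketch (hypothetical names): spreading service replicas across
# Availability Zones so the failure of one AZ cannot take down the service.

def place_replicas(replicas, zones):
    """Round-robin placement of replica IDs across Availability Zones."""
    if len(zones) < 2:
        # The article's rule of thumb: use at least two AZs per region.
        raise ValueError("use at least two AZs per region for availability")
    placement = {zone: [] for zone in zones}
    for i in range(replicas):
        placement[zones[i % len(zones)]].append(f"replica-{i}")
    return placement

# Three replicas of a service across two (hypothetical) Sydney-area AZs:
print(place_replicas(3, ["syd-az-1", "syd-az-2"]))
# {'syd-az-1': ['replica-0', 'replica-2'], 'syd-az-2': ['replica-1']}
```

Even in this toy placement, every zone holds at least one replica, so an outage in either zone leaves the service reachable from the other.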

Another major infrastructure change underway is the move from the traditional three-tier switching model towards a two-tier spine-and-leaf architecture. Rising demand for network bandwidth is driving adoption of 25G servers, which require speeds of 100G and above in the switch fabric. Spine-and-leaf architecture facilitates faster movement of data across the physical links in the network, significantly reducing latency when accessing data. Every spine switch is connected to every leaf switch, and this high-density mesh makes it easy to deploy additional cabling when capacity is required. Spine-and-leaf is increasingly the networking architecture of choice for cloud providers because it is massively scalable, future-ready infrastructure.
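The full-mesh property described above ("every spine switch is connected to every leaf switch") can be made concrete with some simple arithmetic. The sketch below is illustrative, not a design tool: it counts the physical links in a spine-and-leaf fabric and shows why adding a rack (a leaf) scales predictably, since any two servers remain at most two switch hops apart (leaf, spine, leaf).

```python
# Illustrative arithmetic for a two-tier spine-and-leaf fabric, where every
# leaf switch has one uplink to every spine switch (a full mesh).

def fabric_links(spines, leaves):
    """Total physical links in a full-mesh spine-and-leaf fabric."""
    return spines * leaves

def links_to_add(spines, leaves, extra_leaves):
    """New cabling needed to add leaves: one new uplink per spine, per leaf."""
    return fabric_links(spines, leaves + extra_leaves) - fabric_links(spines, leaves)

print(fabric_links(4, 16))       # 64 links for 4 spines x 16 leaves
print(links_to_add(4, 16, 2))    # adding 2 leaf switches needs 8 new uplinks
```

Because growth costs exactly `spines` new links per leaf, capacity planning stays linear, which is part of why the article calls this architecture "massively scalable".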

Taking it to the Edge

Edge computing is the decentralisation of data processing to the “edge” of the network rather than routing data all the way to the central cloud. Instead of traversing the network to the cloud for immediate processing, data is processed in a local edge data centre, reducing network latency during transfer. The end result is a flatter network with fewer hops, enabling more predictable latency.

IT infrastructure is set to adopt edge computing to reduce backhaul traffic to the central repository and conduct data analysis on a local device. This is especially valuable where data is not time-sensitive and does not need to be sent across the network. Edge computing deployments are also ideal when IoT devices have low or poor connectivity and struggle to maintain a constant connection to the cloud. A good example of a basic edge computing device is a wearable health tracker, which can continue to monitor local data such as heart rate and sleep patterns without a constant connection to the cloud.
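The wearable example above can be sketched in code. This is a hypothetical toy, not a real device API: the tracker analyses readings locally at the edge, keeps working while offline, and only ships its backlog upstream when a connection happens to be available, which is exactly the backhaul reduction the paragraph describes.

```python
# Hedged sketch of the wearable example: readings are processed on the
# device (the "edge"), and data is synced to the cloud only when connected.

class HealthTracker:
    def __init__(self):
        self.buffer = []           # readings held locally at the edge

    def record(self, heart_rate):
        self.buffer.append(heart_rate)

    def local_summary(self):
        # Analysis happens on the device -- no round trip to the cloud.
        return sum(self.buffer) / len(self.buffer)

    def sync(self, connected):
        """Send buffered readings upstream only when a link is available."""
        if not connected:
            return 0               # offline: keep working, keep the data
        sent, self.buffer = len(self.buffer), []
        return sent

tracker = HealthTracker()
for hr in (62, 65, 70):
    tracker.record(hr)
print(round(tracker.local_summary(), 1))  # 65.7, computed at the edge
print(tracker.sync(connected=False))      # 0: offline, data stays local
print(tracker.sync(connected=True))       # 3: backlog sent once connected
```

Only the summary or backlog ever crosses the network, so the cloud sees a fraction of the raw data volume the sensor actually produced.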

Credit: Ekkasit Keatsirikul

Edge computing complements data centre capabilities by sharing the data processing load generated by the growing number of connected devices. It remains an emerging concept, and its relevance depends on how much data processing and hosting is pushed out of the central data centre. Data centre operators will have to work out how to complement edge technology in their infrastructure planning.

Resilience is non-negotiable

We’ve already seen transceiver and active electronics vendors’ roadmaps to 400G as they look to satisfy consumer and business appetite for more bandwidth. Next-generation 5G networks, AI, and virtual and augmented reality (VR/AR) will all be key in supporting these many different customer interactions. This level of ubiquitous connectivity – from connected devices to domestic appliances to autonomous vehicles – will have a massive impact on data centres. Such hyper-connectivity will require data centres to meet increasing demand while maintaining the highest levels of reliability.

Deploying an optical fibre solution that supports growing edge computing capacity and efficient 5G delivery to the pole, in conjunction with cloud computing, will help data centre operators assure resilience and future-ready their operations, today.

Tags: Corning, Clive Hogg
