Published 24 Jan 2024

The need for speed: achieving high data rates from data centre to cloud

By Pulsant

Having decided to embrace the opportunities of the cloud, be it a full migration, or developing a hybrid infrastructure, one of the first questions a business faces is how to transfer all its data into the cloud.

This question evolves rapidly, as the business realises that not only must it continually transfer new data to the cloud, but also that the volume of that data is going to grow massively.

This is not just an issue of volume: the data is also increasingly complex and varied, and it is arriving, being generated, and being processed at a much higher velocity.

Gary Taylor, Senior Solutions Architect at Megaport, a Pulsant partner, explains: “The cloud offers a wealth of opportunities for turning an ocean of data into actionable intelligence, especially if it is delivered into the cloud and processed quickly. Consequently, achieving high data rates between a data centre and the cloud has become a front line in securing a competitive advantage.”

To keep your balance, you must keep moving…

Even without the complexities of transforming the data (i.e., turning it from one form into another), transfer isn't as simple as it sounds. The myth of a single, flawless transfer, in which a business feeds files in on one side and sees them arrive intact on the other, is exactly that – a myth.

The reality is that achieving successful, sustainable high data transfer rates demands a little more thought, especially as this capability will typically be needed on an ongoing, and growing, basis.

Consequently, developing a Cloud Centre of Excellence (CCoE) that is responsible for ensuring this capability, from day one, has become a hot topic for many technology leaders.

First, the strategy must establish the administrative cost, time, and decisions that will be necessary. There will be many differences of opinion, approvals to secure from diverse stakeholders, and unanticipated issues to deal with.

Second, the strategy must coordinate the people in your organisation who will both execute these transfers and take responsibility for them. The people who approve data transfer tools and architecture have different concerns from the business stakeholders assessing the risks and opportunities of moving the data.

This consideration of people goes hand in hand with choosing the right architecture. The reality is that this choice will be equally defined by cost, the availability of resources, and commitments to the other projects the team is handling.

It is only once these strategic elements are addressed that the challenging technical facets of achieving high data rates can be assessed.


Bandwidth and other issues

Top of that list of technical challenges is usually insufficient capacity for the speed of data transfer that the business wants.

Mark Lewis, Chief Marketing Officer at Pulsant, explains: “High bandwidth and low latency will be critical for building successful customer experiences, now and in the future. As edge-oriented applications proliferate, the demand for higher data transfer rates increases.”

The temptation is simply to ‘buy more bandwidth’. Over the Internet, however, this is not sustainable in the long term, due to security complexity and limits on scalability. High-data use cases also continue to grow, from aggregating the billions of available sensors within the Internet of Things to data-hungry workloads such as machine learning and AI.
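A little back-of-the-envelope arithmetic shows why buying bandwidth alone doesn't keep pace. The sketch below (illustrative only, not a Pulsant or Megaport tool; the 80% efficiency figure is an assumed allowance for protocol overhead and contention) estimates how long a bulk transfer takes at a given link speed:

```python
def transfer_time_hours(data_tb: float, bandwidth_gbps: float,
                        efficiency: float = 0.8) -> float:
    """Estimate wall-clock hours to move `data_tb` terabytes over a link of
    `bandwidth_gbps` gigabits per second, assuming the link sustains only
    `efficiency` of its nominal rate (an assumed figure for overhead)."""
    bits = data_tb * 8e12                      # terabytes -> bits
    effective_bps = bandwidth_gbps * 1e9 * efficiency
    return bits / effective_bps / 3600

# 100 TB over 1 Gbps vs 10 Gbps:
print(f"{transfer_time_hours(100, 1):.1f} h")   # -> 277.8 h (over 11 days)
print(f"{transfer_time_hours(100, 10):.1f} h")  # -> 27.8 h
```

A tenfold bandwidth upgrade cuts the window tenfold, but if the dataset also grows tenfold the business is back where it started, which is why sustained strategy matters more than raw link speed.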

Another concern is that the business does not have the luxury of moving datasets whilst they are not in active use. Whilst it is theoretically feasible to transfer large amounts of data whilst it is being used, it is far from desirable.

Likewise, when an asset is in transit, it is most at risk. Protecting and monitoring the data while it is in flight is a challenge that should not be underestimated. The lesson is to ensure the data transferred is actually secure and usable in its new home.

Taylor concludes: “High data rates from a data centre to the cloud depend on a strategy that recognises that the business case for such quick transfers will likely demand sustained, repeated transfers. It is not going to be a case of ‘one and done’. Time spent in planning at this early stage will return substantial dividends.”

Lewis adds: “Software-defined networking offers businesses the opportunity to build global-scale connectivity rapidly and flexibly. Our clients combine hybrid cloud, data centre interconnection and Megaport to build solutions that respond to fast-evolving data needs.”

Interested in more insights on the evolving world of connectivity?
Subscribe for updates and stay tuned for the next blog in our series.