Cisco has released an index examining data center and cloud traffic, which found that all regions included in the study (Asia Pacific, Middle East and Africa, Western Europe, Central and Eastern Europe, Latin America and North America) are fully ready for basic cloud applications.
With all study regions cloud ready, the company found that the transition to cloud services is “driving global cloud traffic at a growth rate that is twice as great as global data center traffic.”
Cisco predicted that cloud traffic will grow 12-fold, increasing from 130 exabytes to 1.6 zettabytes annually by 2015. One zettabyte is about the same as streaming 22 trillion hours of music, according to Cisco.
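The 12-fold figure follows from the units involved: one zettabyte equals 1,000 exabytes. A quick sanity check (an illustrative sketch, not part of Cisco's report):

```python
# Illustrative check of the projected growth multiple (not from the report).
cloud_2010_eb = 130          # annual cloud traffic in 2010, in exabytes
cloud_2015_zb = 1.6          # projected annual cloud traffic in 2015, in zettabytes

cloud_2015_eb = cloud_2015_zb * 1000   # 1 zettabyte = 1,000 exabytes
growth = cloud_2015_eb / cloud_2010_eb
print(round(growth, 1))      # ~12.3, i.e. roughly 12-fold growth
```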
In its first Global Cloud Index, Cisco looks at cloud and data center traffic trends and application adoption from 2010 to 2015. The report estimates “global data center and cloud-based Internet Protocol traffic growth and trends.”
“Cloud and data center traffic is exploding,” said Suraj Shetty, vice president of product and solutions marketing, Cisco. “The result: greater data center virtualization and relevance of the network for cloud applications and the need to make sense of a dynamically evolving situation.”
The company found cloud to be the fastest growing part of data center traffic, making up an estimated 11 percent of it. While cloud is the fastest growing component, the bulk of the traffic comes from data centers completing backups and replication processes that are not visible to end users.
The company estimated that 76 percent of data center traffic will remain within the data center by 2015 in the form of workload migration across virtual machines and background tasks.
Additionally, the company found that, for video-based services, data center traffic per hour during peak periods is expected to rise to as much as 2.5 times the average. This projected increase calls for planning to ensure data centers, clouds and networks can manage the needed capacity, the report said.
Cisco shapes its index with modeling and analysis of various primary and secondary sources. The company said it examined more than 30 terabytes of data generated each month over the past year from a variety of global data centers. Charts demonstrating the findings are included in the report.