By KEAO CAINDEC
365 Data Centers


The public cloud is not for everyone, and that fact is bringing an end to the public cloud’s reign as we know it. The public cloud’s massive growth and popularity were a bubble, built on public Infrastructure-as-a-Service (IaaS) being promoted as the ultimate IT solution for most, if not all, companies.

Unfortunately, for the proponents of public cloud omnipotence, the demand for undifferentiated or unsupported IaaS seems to be waning, with cloud-savvy businesses seeking a combination of managed cloud services, private cloud or colocation rather than pure public cloud solutions.

For evidence, look no further than the price wars between the behemoths of the industry as the commoditization of public cloud services reflects buyers’ unwillingness to pay extra for undifferentiated services. Or look at Rackspace abandoning private and public cloud in favor of managed cloud.

The Need for an Alternative Approach

As businesses ventured into the public cloud with high hopes that it would solve all their IT woes – and do so at considerably low cost – they soon discovered that mission-critical functions requiring high availability and performance ran more successfully under a hybrid model that includes private cloud or colocation. These realizations have led some organizations to return to, or augment, their public cloud deployments with private cloud or colocation for applications that demand more performance, control, management or security. The reality is that companies need a hybrid solution to ease into the public cloud, or back out of it, while still controlling the mission-critical parts of IT and the applications that matter most to them and their clients. There are many reasons why a hybrid approach is more practical for a variety of companies.

Protecting Precious Possessions

A company’s most valuable asset is often its data. But data is a particularly vulnerable resource, always at risk of loss or corruption through attack, natural disaster, theft, power outage or malware. Relinquishing control over data and IT resources can be a major downside of using the public cloud. As such, data security is often one of the most pressing questions for IT managers when companies consider alternative data-storage options.

In the public cloud, the infrastructure is out of the company’s hands and, in many cases, geographically far away. Getting in touch with someone in customer care who is willing and able to address technical problems in a timely manner has proven to be another concern.

With colocation, the customer retains control over the operation and maintenance of their servers and hardware while the data center provides the required infrastructure including space, power, bandwidth, cooling, security, and redundant systems. It gives companies increased control over the environment while delivering the benefits of the cloud and the added services they need.

Many data centers offer managed services that can be leveraged to monitor and manage IT infrastructure from the IT manager’s desktop. Managed colocation is offered as a complete turnkey solution in which the customer owns the hardware but the data center manages all aspects of its operation, from updates and systems integration to troubleshooting. The key benefit is that skilled specialists handle each aspect of the IT infrastructure, and that management is local, carried out by people who are accessible when an urgent issue needs immediate attention. Issues such as downtime that may register as a small blip to a large and distant IaaS provider can have disastrous consequences for businesses that rely on those services.

Growing Impact of Proximity

For the most part, public cloud IaaS was built on a model of extremely large, centralized facilities that exploit economies of scale, making management easier for the IaaS provider and improving its profitability.

For this reason, public cloud facilities are generally located in cities with low-cost access to space and power. However, mission-critical applications that require storage and data access need extremely low latency to function and integrate properly with other systems. This typically means the data storage and compute infrastructure must be physically close to one another. Beyond roughly 100 miles, even a fiber-optic connection adds too much latency to deliver the throughput certain applications require.
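To put a rough number on that claim, a back-of-the-envelope sketch follows. It assumes light travels through fiber at about c/1.47 (a typical refractive index) and that the fiber path equals the straight-line distance, which understates real-world routes; actual latency also includes switching and protocol overhead not modeled here.

```python
C_KM_PER_S = 299_792   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47     # assumed refractive index of typical optical fiber
KM_PER_MILE = 1.609

def round_trip_ms(miles: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds."""
    km = miles * KM_PER_MILE
    one_way_seconds = km / (C_KM_PER_S / FIBER_INDEX)
    return 2 * one_way_seconds * 1000

print(round(round_trip_ms(100), 2))  # ~1.58 ms round trip at 100 miles
```

About 1.6 ms per round trip sounds small, but a chatty storage or database protocol that needs dozens of round trips per operation multiplies that penalty into tens of milliseconds, which is why latency-sensitive workloads favor nearby infrastructure.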

Proximity becomes even more important with the advent of new technology at the network edge that will deliver far higher bandwidth to businesses and consumers. Whether it’s mobile’s 4G LTE, high-speed cable or fiber to the home, network throughput at the edge will be increasing by about 10 times in the not-so-distant future. Delivering a good subscriber experience for content or cloud services at the edge will become increasingly difficult from geographically distant locations.

Further, the economics of transporting 10 times more data at the edge could easily mean several times that amount traversing the backbone, and it will no longer make economic sense for carriers to support this model. Current peering agreements will likely break down under the strain. In this case, the benefit of improved local service, security, control and management of your data and cloud assets will pale in comparison to the cold hard reality faced by carriers: they simply will not be able to support massive increases in traffic across their backbones.

What Does it All Mean?

Taken together, these trends mean a more distributed architecture will be necessary going forward. Cloud and data center facilities in so-called Tier 2 and Tier 3 metro areas will become increasingly important as the edge transforms and local content or service delivery becomes necessary.

As for the behemoths, consumers of public cloud IaaS will surely appreciate the price reductions. However, lower prices do not always equate to better value. For businesses running mission-critical applications that demand always-on uptime and high performance to deliver what their clients expect, a local, hybrid cloud and colocation architecture will ultimately be the best choice.
http://www.datacenterknowledge.com/a...-clouds-reign/