  1. #1
    WHT-BR Top Member
    Join Date: Dec 2010
    Posts: 15,049

    [EN] Project Natick: Microsoft's Underwater Data Center

    It all started in 2013 when Microsoft employee Sean James, who had served on a US Navy submarine, submitted a ThinkWeek paper. Norm Whitaker read the paper and built a team to explore the idea of placing computers, or even entire datacenters, in water. In late 2014, Microsoft kicked off Project Natick, a research project to determine the feasibility of subsea datacenters. The rest is history.


    Frequently Asked Questions


    What is Project Natick?

    Project Natick is a Microsoft research project to manufacture and operate an underwater datacenter. The initial experimental prototype vessel, christened the Leona Philpot after a popular Xbox game character, was operated on the seafloor approximately one kilometer off the Pacific coast of the United States from August to November of 2015. Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, and high responsiveness, and that are more environmentally sustainable.

    Why Project Natick?

    • Cloud computing continues to grow in importance, both as a driver of economic growth and as a consumer of global resources.
    • Project Natick is focused on a cloud future that can help better serve customers in areas near large bodies of water (where nearly 50% of society resides). The vision of operating containerized datacenters offshore near major population centers anticipates a highly interactive future requiring data resources located close to users. Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment.


    What are the customer benefits of Project Natick?

    • Rapid provisioning: Ability to deploy a datacenter from start to finish in 90 days, enabling rapid response to market demand and quick deployment for natural disasters and special events such as the World Cup.
    • Latency: Latency is how long it takes data to travel between its source and destination. Half of the world’s population lives within 200 km of the ocean, so placing datacenters offshore brings them dramatically closer to users, reducing latency and providing better responsiveness (see the sketch below).
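
    As a rough illustration of the latency point above, the sketch below compares round-trip propagation times in optical fiber. This is a back-of-the-envelope model, not anything from Microsoft: it assumes signals in fiber travel at roughly two-thirds the speed of light (about 200,000 km/s) and ignores routing, queuing, and serialization overhead, and the two distances are purely illustrative.

        # Rough round-trip propagation delay in optical fiber.
        # Assumption: ~200,000 km/s (about 2/3 of c). Real latency adds
        # routing, queuing, and serialization delays on top of this.
        FIBER_SPEED_KM_PER_S = 200_000

        def rtt_ms(distance_km: float) -> float:
            """Round-trip propagation time in milliseconds."""
            return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

        # Offshore datacenter ~200 km from coastal users vs. an inland
        # facility ~2,000 km away (illustrative distances only).
        for label, km in [("offshore, 200 km", 200), ("inland, 2,000 km", 2000)]:
            print(f"{label}: ~{rtt_ms(km):.0f} ms round trip")

    Even under this idealized model, shortening the fiber path by a factor of ten cuts propagation delay by the same factor (about 2 ms versus 20 ms round trip), which is the proximity argument being made here.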


    How would a Natick datacenter impact the environment?

    We aspire to create a sustainable datacenter which leverages locally produced green energy, providing customers with additional options to meet their own sustainability requirements.

    • Natick datacenters are envisioned to be fully recyclable: built from recycled material that is in turn recycled at the end of the datacenter’s life.
    • A Natick datacenter co-located with offshore renewable energy sources could be truly zero emission: no waste products, whether from the power generation, the computers, or human maintainers, are emitted into the environment.
    • With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly. We see this as an opportunity to field long-lived, resilient datacenters that operate “lights out” – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as 10 years.
    • Natick datacenters consume no water for cooling or any other purpose.


    During our deployment of the Leona Philpot vessel, sea life in the local vicinity quickly adapted to its presence.

    When will Natick datacenters be more widely available as a product?

    Project Natick is currently at the research stage. It’s still early days in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.

    How long is it designed to last down there?

    A Natick datacenter deployment is intended to last up to 5 years, which is the anticipated lifespan of the computers contained within. After each 5-year deployment cycle, the datacenter would be retrieved, reloaded with new computers, and redeployed. The target lifespan of a Natick datacenter is at least 20 years. After that, the datacenter is designed to be retrieved and recycled.

    What does the name Natick mean?

    Natick is a codename and carries no special meaning. It is a town in Massachusetts.



    The Natick team (left to right): Eric Peterson, Spencer Fowers, Norm Whitaker, Ben Cutler, Jeff Kramer.


    http://www.projectnatick.com/

  2. #2
    WHT-BR Top Member
    Join Date: Dec 2010
    Posts: 15,049

    Microsoft Unit Dives Deep for a Data Center Solution

    Microsoft Plumbs Ocean’s Depths to Test Underwater Data Center

    The “Leona Philpot” prototype was deployed off the central coast of California on Aug. 10, 2015.

    JOHN MARKOFF | NYT
    February 1, 2016 - page B1

    Taking a page from Jules Verne, researchers at Microsoft believe the future of data centers may be under the sea.

    Microsoft has tested a prototype of a self-contained data center that can operate hundreds of feet below the surface of the ocean, eliminating one of the technology industry’s most expensive problems: the air-conditioning bill.

    Today’s data centers, which power everything from streaming video to social networking and email, contain thousands of computer servers generating lots of heat. When there is too much heat, the servers crash.

    Putting the gear under cold ocean water could fix the problem. It may also answer the exponentially growing energy demands of the computing world because Microsoft is considering pairing the system either with a turbine or a tidal energy system to generate electricity.

    The effort, code-named Project Natick, might lead to strands of giant steel tubes linked by fiber optic cables placed on the seafloor. Another possibility would suspend containers shaped like jelly beans beneath the surface to capture the ocean current with turbines that generate electricity.

    “When I first heard about this I thought, ‘Water ... electricity, why would you do that?’ ” said Ben Cutler, a Microsoft computer designer who is one of the engineers who worked on the Project Natick system. “But as you think more about it, it actually makes a lot of sense.”

    Such a radical idea could run into stumbling blocks, including environmental concerns and unforeseen technical issues. But the Microsoft researchers believe that by mass producing the capsules, they could shorten the deployment time of new data centers from the two years it now takes on land to just 90 days, offering a huge cost advantage.

    The underwater server containers could also help make web services work faster. Much of the world’s population now lives in urban centers close to oceans but far away from data centers usually built in out-of-the-way places with lots of room. The ability to place computing power near users lowers the delay, or latency, people experience, which is a big issue for web users.

    “For years, the main cloud providers have been seeking sites around the world not only for green energy but which also take advantage of the environment,” said Larry Smarr, a physicist and scientific computing specialist who is director of the California Institute for Telecommunications and Information Technology at the University of California, San Diego.

    Driven by technologies as varied as digital entertainment and the rapid arrival of the so-called Internet of Things, the demand for centralized computing has been growing exponentially. Microsoft manages more than 100 data centers around the globe and is adding more at a rapid clip. The company has spent more than $15 billion on a global data center system that now provides more than 200 online services.

    In 2014, engineers in a branch of Microsoft Research known as New Experiences and Technologies, or NExT, began thinking about a novel approach to sharply speed up the process of adding new power to so-called cloud computing systems.

    “When you pull out your smartphone you think you’re using this miraculous little computer, but actually you’re using more than 100 computers out in this thing called the cloud,” said Peter Lee, corporate vice president for Microsoft Research and the NExT organization. “And then you multiply that by billions of people, and that’s just a huge amount of computing work.”

    The company recently completed a 105-day trial of a steel capsule, eight feet in diameter, that was placed 30 feet underwater in the Pacific Ocean off the Central California coast near San Luis Obispo. Run from offices on Microsoft’s campus in Redmond, Wash., the trial proved more successful than expected.

    The researchers had worried about hardware failures and leaks. The underwater system was outfitted with 100 different sensors to measure pressure, humidity, motion and other conditions to better understand what it is like to operate in an environment where it is impossible to send a repairman in the middle of the night.
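
    To make the “no repairman” constraint concrete, here is a minimal sketch of the kind of lights-out telemetry check such a vessel would need. Everything in it is an assumption for illustration: the sensor names, the safe-range thresholds, and the alert format are hypothetical, not Natick’s actual instrumentation.

        # Hypothetical lights-out telemetry check: poll sensor readings and
        # flag anything outside its assumed safe operating envelope, since no
        # technician can be sent down to investigate in person.
        from dataclasses import dataclass

        @dataclass
        class Envelope:
            low: float
            high: float

        # Illustrative limits; real values would come from engineering specs.
        LIMITS = {
            "internal_pressure_kpa": Envelope(95.0, 110.0),
            "humidity_pct": Envelope(0.0, 20.0),
            "rack_temp_c": Envelope(5.0, 35.0),
        }

        def check_telemetry(readings):
            """Return alert strings for any out-of-range readings."""
            alerts = []
            for name, value in readings.items():
                env = LIMITS.get(name)
                if env and not (env.low <= value <= env.high):
                    alerts.append(f"{name}={value} outside [{env.low}, {env.high}]")
            return alerts

        # Example poll: rising humidity could indicate a leak.
        print(check_telemetry({"internal_pressure_kpa": 101.3,
                               "humidity_pct": 27.5,
                               "rack_temp_c": 22.0}))

    The design point is simply that every failure mode has to be detected and reasoned about remotely; there is no “send someone to look at it” fallback.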

    The system held up. That led the engineers to extend the time of the experiment and to even run commercial data-processing projects from Microsoft’s Azure cloud computing service.

    The research group has started designing an underwater system that will be three times as large. It will be built in collaboration with a yet-to-be-chosen developer of an ocean-based alternative-energy system. The Microsoft engineers said they expected a new trial to begin next year, possibly near Florida or in Northern Europe, where there are extensive ocean energy projects underway.

    The first prototype, affectionately named Leona Philpot (a character in Microsoft’s Halo video game series), has been returned, partly covered with barnacles, to the company’s corporate campus in Redmond.

    It is a large white steel tube, covered with heat exchangers, with its ends sealed by metal plates and large bolts. Inside is a single data center computing rack that was bathed in pressurized nitrogen to efficiently remove heat from computing chips while the system was tested on the ocean floor.

    The idea for the underwater system came from a research paper written in 2014 by several Microsoft data center employees, including one with experience on a Navy submarine.

    Norman A. Whitaker, the managing director for special projects at Microsoft Research and the former deputy director at the Pentagon’s Defense Advanced Research Projects Agency, or Darpa, said the underwater server concept was an example of what scientists at Darpa called “refactoring,” or completely rethinking the way something has traditionally been accomplished.

    Even if putting a big computing tube underwater seems far-fetched, the project could lead to other innovations, he said. For example, the new undersea capsules are designed to be left in place without maintenance for as long as five years. That means the servers inside it have to be hardy enough to last that long without needing repairs.

    That would be a stretch for most servers, but they will have to improve in order to operate in the underwater capsule — something the Microsoft engineers say they are working on.

    They’re also rethinking the physical alignment of data centers. Right now, servers are put in racks so they can be maintained by humans. But when they do not need maintenance, many parts that are just there to aid human interaction can be removed, Mr. Whitaker said.

    “The idea with refactoring is that it tickles a whole bunch of things at the same time,” he said.

    In the first experiment, the Microsoft researchers said they studied the impact their computing containers might have on fragile underwater environments. They used acoustic sensors to determine if the spinning drives and fans inside the steel container could be heard in the surrounding water. What they found is that the clicking of the shrimp that swam next to the system drowned out any noise created by the container.

    One aspect of the project that has the most obvious potential is the harvest of electricity from the movement of seawater. This could mean that no new energy is added to the ocean and, as a result, there is no overall heating, the researchers asserted. In their early experiment the Microsoft engineers said they had measured an “extremely” small amount of local heating of the capsule.

    “We measured no heating of the marine environment beyond a few inches from the vessel,” Dr. Lee said.
    http://www.nytimes.com/2016/02/01/te...ta-center.html

  3. #3
    WHT-BR Top Member
    Join Date: Dec 2010
    Posts: 15,049
    Quoting the FAQ above: “The vision of operating containerized datacenters offshore near major population centers anticipates a highly interactive future requiring data resources located close to users.”

    Meanwhile, in Brazil, we celebrate the launch of yet another submarine cable.
