Monthly Archives: January 2014
Facebook Inc. said it plans to move this year to new networking equipment in its data centers and stop purchasing gear from conventional networking suppliers. The new gear will be based on standards being developed by the Open Compute Project, Jay Parikh, Facebook’s vice president of infrastructure, told CIO Journal. Facebook’s decision could impact incumbent gear providers like Cisco Systems Inc., especially if more companies follow suit, but those vendors say the new approaches lack sophistication.
SAN JOSE, Calif. – Facebook once contemplated building its own software to manage its massive data center infrastructure. But after a lengthy review of its options, the company has opted to use software from CA Technologies to track and manage its data center capacity. The announcement is a significant win for CA, which beat out a dozen companies for the high-profile deal. Facebook will use CA Data Center Infrastructure Management (DCIM) software to bring together millions of energy-related data points from physical and IT resources in its global data centers to improve power efficiency.
SAN JOSE, Calif. — Mark Zuckerberg was in his element. Mr. Zuckerberg, whose social network turns 10 years old next week, spoke Tuesday at a meeting of the Open Compute Project. Open Compute is an initiative that Facebook started three years ago to bring to computer servers and Web management the kind of cost cuts and efficiency gains that open-source software — where programmers share ideas and code across company, university and even national boundaries — has delivered to big computing centers.
At last year’s Open Compute Summit, Facebook VP of Engineering Jay Parikh suggested the company might consider Blu-ray discs as a medium for improving durability and cutting costs for its long-term cold storage efforts. Now, Parikh said Tuesday during a keynote at this year’s event, the company is actually doing it. Facebook has built a prototype Blu-ray system capable of storing 10,000 discs and 1 petabyte of data in a single cabinet, with plans to scale it to 5 petabytes per cabinet. Blu-ray storage would save the company 50 percent in costs and 80 percent in energy usage over its existing hard-disk-based cold storage methods, Parikh said, and would provide 50 years’ worth of durability for data stored on the discs.
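A quick back-of-the-envelope check of those figures (a sketch, assuming decimal storage units and that the planned 5-petabyte cabinet keeps the same 10,000-disc count rather than adding discs — neither detail is stated above):

```python
# Implied per-disc capacity of the prototype Blu-ray cold-storage cabinet,
# using only the figures quoted above: 10,000 discs, 1 PB per cabinet.
DISCS_PER_CABINET = 10_000
PETABYTE_GB = 1_000_000  # 1 PB = 1,000,000 GB in decimal units (assumption)

gb_per_disc = PETABYTE_GB / DISCS_PER_CABINET
print(f"Prototype: {gb_per_disc:.0f} GB per disc")  # 100 GB per disc

# If the planned 5 PB cabinet held the same number of discs (assumption),
# each disc would need five times the capacity.
planned_gb_per_disc = 5 * PETABYTE_GB / DISCS_PER_CABINET
print(f"Planned:   {planned_gb_per_disc:.0f} GB per disc")  # 500 GB per disc
```

The prototype's implied 100 GB per disc sits near the capacity of multi-layer BDXL media, which suggests the 5 PB target would require either denser discs or more discs per cabinet.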
Fusion-io, a company focused on flash storage in the data center, said its products will be integrated into Quanta Rackgo X systems, which are stripped-down servers. The news comes out of the Open Compute Project (OCP) conference in San Jose. For Fusion-io, the Quanta deal means more distribution. The move also shows how white-box manufacturers — contractors that make servers and PCs for other companies — have become players in the data center. In other words, movements like OCP mean so-called “vanity free” servers are gaining ground.
If it sounds counterintuitive that software giant Microsoft is contributing its server specifications to the Open Compute Project, it shouldn’t. By doing so, the company hopes that big hardware makers will build servers just like the ones it runs in its huge data centers, and perhaps give it a more efficient supply chain.
SAN JOSE, Calif. – Over the last three years, Facebook has saved more than $1.2 billion by using Open Compute designs to streamline its data centers and servers, the company said today. Those massive savings are the result of hundreds of small improvements in design, architecture and process, writ large across hundreds of thousands of servers.
The fifth summit of the Open Compute Project is happening on Tuesday, and Microsoft has revealed that it is the latest member to join the group, a Facebook-founded initiative that sees the company and its partners commit to developing and sharing designs for data center infrastructure. Bill Laing, Microsoft’s corporate vice president for cloud and enterprise, says in a blog post that Microsoft will contribute what it calls its “cloud server specification” to OCP.