Facebook's big data plans include warehouses, faster analytics

An engineer reveals at an industry conference how the site is working to make its back-end data processing more efficient

Facebook may treasure the data it holds on its one billion-plus users for the advertising returns it brings, but analyzing that data is expected to keep posing numerous challenges over the coming year, an engineer said.

The problems, which Facebook has been forced to grapple with "much sooner than the broader industry," include finding more efficient ways to process user behavior on the site, better accessing and consolidating different types of data across Facebook's multiple data centers, and devising new open source software systems to process that data, Ravi Murthy, who manages Facebook's analytics infrastructure, said Tuesday.

"Facebook is a data company, and the most obvious thing people think of on that front is ads targeting," he said at an industry conference in San Francisco, during a talk on Facebook's back-end infrastructure, data analytics and open source projects.

"But it goes deeper than this," he said.

One major area of behind-the-scenes work relates to Facebook's analytics infrastructure, which is designed to accelerate product development and improve the user experience through deep analysis of all the available data, whether that data consists of actions users take on the site, such as posting status updates, or the applications they use within Facebook on different devices.

Facebook currently processes and analyzes its data with several open source software systems, including Hadoop, Corona and Prism, and the company will focus on making them faster and more efficient over the next six to twelve months, Murthy said.

Many of the company's challenges are tied to what Facebook refers to as its data warehouse, which combines data from multiple sources into a database where user activity can be analyzed in the aggregate. That might mean generating a daily report on the number of photos tagged in a specific country, or looking at how many users in a certain area have engaged with pages that were recommended to them.
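The kind of aggregate reporting Murthy described, such as counting tagged photos per country for a given day, boils down to rolling up event logs. A minimal sketch of that pattern in Python follows; the log format and field names here are hypothetical illustrations, not Facebook's actual schema.

# Hedged sketch of a warehouse-style daily rollup; field names are hypothetical.
from collections import Counter

def daily_tagged_photos_by_country(events):
    """Count photo-tag events per country in one day's worth of log records."""
    counts = Counter()
    for event in events:
        if event.get("event_type") == "photo_tag":
            counts[event.get("country", "unknown")] += 1
    return counts

# A few synthetic log records to show the shape of the output
sample_events = [
    {"event_type": "photo_tag", "country": "US"},
    {"event_type": "photo_tag", "country": "BR"},
    {"event_type": "status_update", "country": "US"},
    {"event_type": "photo_tag", "country": "US"},
]
print(daily_tagged_photos_by_country(sample_events))  # Counter({'US': 2, 'BR': 1})

At warehouse scale the same rollup would run as a distributed job over a system like Hadoop rather than in a single process, but the logic of the report is the same.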

The analysis is designed to optimize the user experience and find out what users like and don't like, but it is also becoming more taxing as Facebook is able to access more and more data about its users, Murthy said. Currently, the warehouse takes in 500 terabytes of new data every day, or 500,000 gigabytes. The warehouse has grown nearly 4,000-fold in size over the last four years, "way ahead of Facebook's user growth," Murthy said.
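As a quick back-of-the-envelope check on those figures (the numbers are just the ones Murthy cited; the compounding math is an illustration, not a Facebook disclosure), a 4,000-fold increase over four years works out to roughly an eightfold increase each year.

# Back-of-the-envelope arithmetic on the figures cited above; illustrative only.
daily_intake_tb = 500                       # 500 terabytes of new data per day
daily_intake_gb = daily_intake_tb * 1000    # = 500,000 gigabytes
growth_factor = 4000                        # warehouse grew ~4,000-fold in four years
years = 4
annual_growth = growth_factor ** (1 / years)    # roughly 8x per year, compounded
print(daily_intake_gb, round(annual_growth, 1))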

To deal with these issues, Facebook has developed its Prism software system, which is designed to perform key analysis functions across the company's data centers worldwide, and split up the analyses into "chunks," Murthy said. That way, performing an analysis on, say, some metric related to users' news feeds won't clog up the warehouse more generally.

"We're increasingly thinking about how to capture this data," he said.

The company is also working on a system that takes a completely different approach to querying the warehouse, with the aim of returning results within a matter of seconds, Murthy said.

Another area Facebook is continually looking to improve is its "transactional infrastructure," which handles the more basic, day-to-day processing of likes, comments and status updates that keeps the social network running smoothly. Among the questions the company's engineers and analysts are examining are how to forecast the actual growth in this type of data, and how much computing capacity Facebook should really allot for it, Murthy said.

"Can we predict what it's going to be six months from now?" he said.

Meanwhile, Facebook is also involved in a long-term effort to make its physical servers more efficient. The company began its Open Compute Project in 2011 with the goal of designing modularized servers that give customers greater control over the networking, memory, power supplies and other components in their machines. The project was expanded in January to incorporate ARM processors.

Zach Miners covers social networking, search and general technology news for IDG News Service. Follow Zach on Twitter at @zachminers. Zach's e-mail address is zach_miners@idg.com
