‘DeepFace’ could provide instantaneous facial recognition via Facebook

19 Mar, 2014 03:48

Facebook, which owns the largest stockpile of photos in the world, has announced plans to unleash facial recognition technology with a new program that promises to identify the subject of an untagged image with nearly unparalleled accuracy.

Researchers at the social media giant claim that humans who look at two faces can determine whether they show the same person with 97.53 percent accuracy. They promise that the company’s new “DeepFace” program will be able to do the same with 97.25 percent accuracy.

Facebook users may have already noticed that the site is able to suggest friends to tag when a new picture is uploaded. It does so by analyzing the distance between an individual's eyes and nose in both profile pictures and already tagged images.
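The geometric idea behind that tag-suggestion feature can be illustrated with a minimal sketch. The landmark coordinates, the ratio-based signature, and the tolerance below are illustrative assumptions for demonstration, not Facebook's actual method:

```python
import math

def landmark_distance(p, q):
    # Euclidean distance between two (x, y) facial landmark points.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_nose_signature(left_eye, right_eye, nose):
    # Dividing by the inter-eye distance makes the signature
    # scale-invariant, so photo size does not matter.
    inter_eye = landmark_distance(left_eye, right_eye)
    return landmark_distance(left_eye, nose) / inter_eye

def likely_same_face(sig_a, sig_b, tolerance=0.05):
    # Two photos with close signatures are candidates for the same person.
    return abs(sig_a - sig_b) <= tolerance
```

A real system would compare many such measurements at once; a single eyes-to-nose ratio is far too coarse on its own, which is part of why the approach described above is less accurate than DeepFace.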

The new DeepFace program will be much more intensive, using software to correct the angle of a face in an image by comparing it to a 3D model of an average face. It then runs the result through a neural network to produce a numerical description of the face. If two descriptions are similar enough, Facebook concludes the faces are in fact the same.
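The final matching step in that pipeline can be sketched as comparing two descriptor vectors. The vector length, the cosine-similarity measure, and the threshold here are illustrative assumptions, not the paper's actual verification metric:

```python
import numpy as np

def face_similarity(desc_a: np.ndarray, desc_b: np.ndarray) -> float:
    # Cosine similarity between two numerical face descriptors:
    # 1.0 means identical direction, 0.0 means unrelated.
    a = desc_a / np.linalg.norm(desc_a)
    b = desc_b / np.linalg.norm(desc_b)
    return float(np.dot(a, b))

def same_person(desc_a, desc_b, threshold=0.8):
    # Declare a match when similarity clears a tuned threshold.
    return face_similarity(desc_a, desc_b) >= threshold
```

In practice the threshold would be tuned on labeled pairs to trade off false matches against missed ones.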

DeepFace was developed by Facebook artificial intelligence (AI) researchers Yaniv Taigman, Ming Yang, and Marc'Aurelio Ranzato, along with Lior Wolf, a faculty member at Tel Aviv University in Israel. Their research paper was first reported last week by the Massachusetts Institute of Technology's Technology Review.

“This deep network involves more than 120 million parameters using several locally connected layers without weight sharing, rather than the standard convolutional layers,” the company announced.

“Thus, we trained it on the largest facial dataset to date, an identity-labeled dataset of four million facial images belonging to more than 4,000 identities, where each identity has over a thousand samples. The learned representations coupling the accurate model-based alignment with the large facial database generalize remarkably well to faces in unconstrained environments, even with a simple classifier.”
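The quote's distinction between locally connected and convolutional layers is what drives the 120-million-parameter count, and a short sketch makes the difference concrete. The layer sizes below are illustrative assumptions, not DeepFace's actual architecture:

```python
def conv_params(k, c_in, c_out):
    # A standard convolutional layer shares one k x k filter bank
    # across every position of the input, so its weight count is
    # independent of the image size.
    return k * k * c_in * c_out

def locally_connected_params(k, c_in, c_out, h_out, w_out):
    # A locally connected layer without weight sharing learns a
    # separate filter bank at every output position, so parameters
    # scale with the output area.
    return h_out * w_out * conv_params(k, c_in, c_out)

# Hypothetical layer: 3x3 filters, 16 channels in and out,
# applied over a 55x55 output grid.
shared = conv_params(3, 16, 16)                       # 2,304 weights
local = locally_connected_params(3, 16, 16, 55, 55)   # 6,969,600 weights
```

This roughly 3,000-fold blow-up per layer is how a handful of locally connected layers can account for most of a 120-million-parameter network.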

DeepFace is still in the research stage and has not been rolled out to Facebook's 1.23 billion users.

The team, which plans to present the program in June at a computer vision conference, said it released the research paper last week to solicit the opinions of other qualified experts and to gauge public opinion as a whole. That decision may have been motivated by the questions raised when Facebook announced 18 months ago that it had purchased the Israeli startup Face.com for a reported price of approximately US$60 million.

The announcement revives concerns about an Orwellian future that were first raised by news of the 2012 Face.com deal.

“As Facebook's database develops, it's conceivable that within a few years you could see someone on the street, point your iPhone at her, and pull up a list of possible identity matches within seconds,” Slate technology blogger Will Oremus wrote at the time.

“For now, Facebook only auto-suggests the identities of people who are among your friends. Still, the company will possess the information and capacity to identify and track people on a broad scale... Only the company's concern for your privacy will stand in the way.”

Facebook founder and CEO Mark Zuckerberg, who phoned US President Obama last week to complain about the National Security Agency's surveillance policies, announced earlier this year that the company had been investigating how best to implement AI technology in the future.

“The goal really is just to try to understand how everything on Facebook is connected by understanding what the posts that people write mean and the content that's in the photos and videos that people are sharing,” Zuckerberg said on a conference call with investors earlier this year, as quoted by Bianca Bosker of the Huffington Post. “The real value will be if we can understand the meaning of all the content that people are sharing, we can provide much more relevant experiences in everything we do.”