Google Nest Wants Your Friends’ Faces: Should You Give In?

We’re no strangers to face identification technology. iPhone users have been creating face profiles for Apple for years, and biometric face scans now help us complete all kinds of tasks, from unlocking our phones to paying for groceries. Companies are quick to assure us that face scan data is encrypted and not handed out to third parties, but some privacy concerns linger. No example is more pertinent than “familiar face” home security technology, which I discuss in my recent feature on home AI.

Security companies like Google Nest and SimpliSafe are using face scan technology that lets users create a face contact list for friends and family, using pictures they already have. That data then gets used in video doorbells and home cameras. But instead of making face identification decisions for yourself, you’re making them for other people, possibly without their consent. That’s a new can of worms, and it deserves a closer look from anyone investing in home security devices.

The basics behind ‘familiar face’ technology


Doorbells like Ring’s can let you know if a human approaches, but algorithms can go further.

Ring

Nest was the first, but now other security companies are starting to offer their own face recognition options. The process is similar across brands: Subscribe to a service like Nest Aware ($8 per month) or SimpliSafe’s beta program for professional home monitoring with AI, and face recognition is enabled on compatible home cameras and video doorbells. In Nest’s case, it works with compatible systems like ADT, too.

You then have the option to use face photos to create a library of recognized, named faces. The cameras’ algorithms will do their best to analyze approaching faces and report if they recognize anyone. Nest is quick to remind people to check local privacy laws (we’ll get to that part below) and get a person’s permission first, although there’s no requirement to take that step. SimpliSafe suggests using it for encounters as casual as your dog walker’s visits, so its security monitoring agents can focus on strangers when they take a look.

“Over time, facial recognition becomes more accurate at determining familiar faces and can send you more helpful alerts,” said Julie Zhu, product manager for Google Nest. “We can also see a future where Nest users build automations related to a specific person, like personalized doorbell chimes.”


Alerts for familiar faces add some convenience, but the benefits are slim for the work involved.

Google/Amazon

Doorbell chimes may be a tough sell for homeowners thinking about logging their friends’ faces. On the security side, Zhu also pointed out that the software lets Nest notify users if an unrecognized face shows up at the door, a useful feature for people who have trouble with strangers. It’s also easy to envision a scenario where you really want to know whether it’s nosy Aunt Carol at the door or your unpleasant ex making an unexpected visit.

Overall, we’re not entirely convinced those features are worth signing up everyone’s faces. Smart home integrations like automatically unlocking doors for certain faces have more appeal, but that kind of compatibility is slow to arrive, and privacy issues remain.

Privacy, face data and big questions


Creating face profiles for family is understandable, but creating them for workers or acquaintances starts feeling odd. 

Kasa

Two big questions arise when considering face ID technology: Can it be stolen, and can law enforcement use it for tracking? Security vulnerabilities and data breaches are, sadly, not going anywhere, which is why storing and encrypting face data on-device is so important. Companies like Google do store face data in the cloud, but it’s encrypted first so that no one at Google can access it directly. Even so, companies may retain the option to use face data to train their AI models, even if their employees can’t view it.

The second question is thornier. Facial recognition and law enforcement are an unstable concoction with plenty of heated opinions. Federal agencies and local law enforcement both use face recognition, but with startlingly few regulations in place.

So, could cops demand your face library as a way to track potential suspects or anyone else they want? While police can request home security data in life-or-death emergencies (or with warrants), we don’t have any reports of them asking for face profiles. There’s not really a way for law enforcement to collect and process these face profiles in the first place, and even if there were, companies typically don’t keep that data unencrypted in their clouds. So it’s not a significant concern, at least for now.


Arlo’s line of doorbells contains many detection features, but no face profiles.

Ring

More worrying are the trust issues involved in managing a portfolio of family and friend faces, then letting your cameras’ algorithms use them automatically. Nest’s familiar face tools do limit the ability of children under 13 to use or change them, but adults can use them however they want as long as they have photos to work with. And you’re incentivized to use more faces so the detection features can learn and become more accurate over time.

With Apple Intelligence already looking through our phones’ photos to generate images, we’re getting more comfy with AI seeing our pics. But it can still feel like a violation of privacy to some people when video doorbells or cameras get involved. And that’s causing some governments to take action.

Laws are catching up to face recognition, but progress is unpredictable


As privacy laws grow more common, we may see familiar face technology largely banned.

IG Photography/Getty Images

Out of all the detection options available with today’s algorithms, facial recognition does the most to summon fears of sci-fi dystopias. That’s one reason it’s also a leading driver of new laws about AI and face privacy, which are still sporadic but growing.

While federal regulations remain slim to none, we now have executive orders requiring agencies to review all their face recognition tools. The European Union is going a step further, banning real-time face recognition in public except in narrow circumstances.

Locally, US states and cities have stepped up to fill the gap. Illinois has passed some of the toughest biometric privacy laws, which has led Google to block the familiar-face service from the state entirely. Meanwhile, privacy laws in Illinois, Texas and Portland, Oregon, keep SimpliSafe from offering its AI-enhanced guard services with face recognition in those areas.


Installing a Ring video doorbell.

Ring

We’re already seeing some workarounds for these growing privacy laws. Eufy, for example, told me its team was interested in using parent company Anker’s voice-recognition capabilities in its home security products. Would you prefer saving voice profiles instead of faces? ScarJo may not be a fan, but some homeowners may feel safer with this option.

We’re keeping tabs on this kind of face recognition, and we’ll let you know if concerns arise or new legislation starts to have a noticeable impact on the field. For now, use this kind of technology judiciously, and try to ask permission first.

For more information on home security and privacy, take a look at our guides on the best ways to deter burglars, whether it’s legal to record video and audio in your home and the spots where you should never put a home security camera if you want to avoid lawsuits.