Ring cameras' new Familiar Faces tool could violate state privacy laws, privacy experts say

On Sept. 30, Amazon announced new Ring security cameras and AI features at an event in New York City. Now, the Electronic Frontier Foundation (EFF) says that one of those AI features, a tool called Familiar Faces, "has the potential to violate the privacy rights of millions of people and could result in Amazon breaking state biometric privacy laws."
Mashable attended the Sept. 30 event, where Ring founder Jamie Siminoff, who now leads the company again, provided an overview of the Ring product line's new AI features. Siminoff only recently returned to Ring, and he has made AI one of his top priorities.
He has also renewed Ring's work with law enforcement, a past source of controversy that the company had backed away from in recent years. Siminoff founded Ring in 2013 and eventually sold the company to Amazon for $1 billion, but he left the company in 2023 before returning in 2025.
What is Familiar Faces, and how could it violate privacy laws?
Like virtually every tech company, Ring is embracing AI. That includes the new Familiar Faces tool, which gives Ring cameras the ability to recognize trusted friends, neighbors, or family members to provide more personalized notifications and home monitoring. (Or, as Amazon describes it, "Familiar Faces intelligently recognizes familiar people and empowers customers to reduce notifications triggered by familiar people's routine activities.")
To do this, Ring cameras scan the faces of people who enter the camera's view, without their knowledge or consent. Ring camera owners can turn the feature on or off; the people being scanned cannot.
This week, the EFF said the tool could violate state privacy laws that require consumers to actively opt in before technologies like facial scanning are used on them.
"Many biometric privacy laws across the country are clear: Companies need your affirmative consent before running face recognition on you," writes Mario Trujillo in a new EFF report.
Trujillo also points out that Amazon has already confirmed "the feature will not be available in Illinois and Texas — strongly suggesting its feature could not survive legal scrutiny there."
"Amazon says it will provide in-app messages to customers, reminding them to comply with applicable laws. But Amazon—as a company itself collecting, processing, and storing this biometric data—could have its own consent obligations under numerous laws," Trujilo writes.
Ring's controversial history with user privacy
Ring, the Amazon-owned brand of video doorbells and smart security cameras, doesn't have a great track record with user privacy. In fact, that's probably understating the problem.
Mashable has reported on the company's repeated privacy controversies. Progressive critics have also taken issue with the company's work with law enforcement over the years, which included sharing users' footage with police without their knowledge or consent, and absent a warrant or subpoena.
Most notably, in 2023, the FTC accused Ring of allowing employees and contractors to watch users' private videos, a case that ultimately resulted in a $5.8 million settlement.
Despite this, Ring remains an extremely popular brand, and the company's video doorbells and security cameras can be found in millions of homes. And while critics take issue with the company's close work with law enforcement, some customers may actually view this as a positive feature for a home security company.
What does Amazon say about Familiar Faces and privacy?
The EFF isn't the only notable critic of Familiar Faces. Democratic U.S. Senator Ed Markey sent Amazon a letter on Oct. 31 calling on the company to abandon its plans for facial recognition.
In his letter, the senator wrote:
“Although Amazon stated that Ring doorbell owners must opt in to activate the new facial recognition feature, that safeguard does not extend to individuals who are unknowingly captured on video by a Ring doorbell camera. These individuals never receive notice, let alone the opportunity to opt in or out of having their face scanned and logged in a database using FRT. To put it plainly, Amazon’s system forces non-consenting bystanders into a biometric database without their knowledge or consent. This is an unacceptable privacy violation.”
The EFF also sent Amazon a list of questions about the Familiar Faces feature, including whether it would be available in states that require opt-in consent to process sensitive data such as facial biometric scans.
In response, the company wrote, "Customers are expected to use our products and features in accordance with law. We display a message in-app to remind customers that they should comply with applicable laws that may require obtaining consent prior to identifying people."
Amazon also told the EFF that processing for Familiar Faces happens in the cloud, not on the device, and pointed to the security measures it has in place:
“Ring's Familiar Faces feature happens in the cloud, not on the device. We implement comprehensive security measures including encryption for data at rest and in transit, access controls, and database isolation to protect user biometric data. Users maintain control over their profiles with the ability to delete any profile at any time, resulting in removal of associated biometric data.”
While privacy advocates and Ring's critics take issue with Familiar Faces, it's not clear how many Amazon customers share these reservations. Indeed, for customers worried about crime, facial recognition and close ties with law enforcement may be a feature, not a bug.
Familiar Faces is set to launch in December, according to an Amazon blog post.