Amazon’s latest update to its Ring doorbell technology introduces a contentious AI-driven feature known as “Familiar Faces,” which can recognise and identify visitors at your door. Announced earlier this month, this feature is now being rolled out to Ring users in the United States.
The Familiar Faces feature allows users to curate a list of up to 50 frequently seen individuals, such as family, friends, delivery personnel, and domestic staff. When you label a face in the Ring app, the device will automatically recognise that person when they approach, providing personalised notifications like “Dad at the Front Door” instead of generic alerts.
However, the feature has triggered significant pushback from consumer protection advocates and lawmakers amid growing privacy concerns. The feature also lets users tailor notifications; for example, they can choose not to be alerted when they themselves appear in their own footage. This customisation is controlled through the app settings, where faces can easily be labelled, edited, or deleted.
Despite Amazon’s assurances that facial data is encrypted and shared only within the Ring ecosystem, scepticism remains. Critics point to the company’s chequered history with law enforcement, including partnerships that gave police access to user footage. Reporting has also detailed Amazon’s collaboration with companies such as Flock, which supplies AI surveillance tools to police, sparking fears that facial-recognition data could become accessible to authorities.
Furthermore, Amazon’s data security has come under scrutiny: the company paid a $5.8 million fine over inadequate protections around user video access, raising doubts about the reliability of its privacy promises. Past exposures of sensitive information, such as user addresses and passwords surfacing on the dark web, compound these concerns.
As a result, some lawmakers, including Senator Ed Markey, are calling for the feature to be abandoned altogether. Owing to biometric privacy restrictions, Familiar Faces is not available in certain jurisdictions, including Illinois, Texas, and Portland, Oregon.
While Amazon insists that biometric data is processed in the cloud and is not used to train AI models, doubts persist about the integrity of its technical infrastructure, especially when set alongside other features that track users across neighbourhood networks.
In light of these developments, the Electronic Frontier Foundation (EFF) is calling for regulatory bodies to scrutinise the implications of this feature closely. They argue that basic privacy rights should not be compromised by technology updates that can potentially monitor everyday interactions, urging caution for users who may wish to utilise the facial recognition technology in their homes.
In summary, while Amazon positions the Familiar Faces feature as a useful innovation, the combination of privacy fears, security flaws, and law enforcement collaborations raises vital questions about the balance between convenience and personal privacy in the era of AI technology.