TL;DR
- Google Home users report their cameras describing fictional people.
- One summary said “Michael was seen taking out the trash,” while another labeled a man living alone as “Sarah.”
- Google responded with a statement saying it is actively investing in improving the accuracy of facial identification.
Update: October 28, 2025 (2:15 PM ET): In response to our original article below, a Google spokesperson reached out to us with the following statement:
Gemini for Home (including AI descriptions, Home Brief, and Ask Home) is in early access so users can try these new features and continue giving us feedback as we work to perfect the experience. As part of this, we are investing heavily in improving accurate identification. This includes incorporating user-provided corrections to generate more accurate AI descriptions. Since all Gemini for Home features rely on our underlying Familiar Faces identification, improving this accuracy also means improving the quality of Familiar Faces. This is an active area of investment and we expect these features to keep improving over time.
The statement may not fully address how these particular users came to be known as Michael and Sarah, but at least Google is acknowledging that Gemini for Home’s facial recognition isn’t perfect at this stage and is working to improve it.
Original article: October 27, 2025 (12:30 PM ET): With Halloween week upon us, it probably isn’t the best time for your smart home to start suggesting that people you aren’t familiar with have been wandering your property. However, a couple of Reddit users claim their Google Home devices have started describing people they don’t know.
According to a post on the r/googlehome subreddit, a user’s Nest camera produced an activity summary saying “Michael was seen taking out the trash,” even though no one by that name lives there. When the user asked about it, Google’s assistant reportedly replied that its camera “can identify faces even if you haven’t explicitly named them,” and that it had spotted “Michael” between October 26 and 27. The user said he’d never entered that name into the system, yet the actions described were exactly what he himself had done. He found the response “pretty creepy.”
While one commenter in the thread chimed in with a Halloween joke about it being Michael Myers, others shared similar oddities. One person said their Google Home randomly reminded them to take out the trash, denied ever doing so, and then insisted they must be mistaken, prompting jokes that Google was gaslighting them. Another claimed his device reported that his friend David had visited and vacuumed the living room; the user does have a friend by that name, but that person hadn’t visited, and no one had vacuumed.
Their camera generated a summary of their day, but identified them as a woman named Sarah.
A second Redditor, posting a few days earlier, claimed their camera generated a summary of their day but identified them as a woman named Sarah. The AI even described “Sarah” taking out the bins and walking her dogs. “I am a man, live alone, and don’t even have a friend named Sarah,” the user wrote. When he questioned Gemini about it, he got the confusing response, “You told me that Sarah is actually {my name} today, but that I invented the name Sarah yesterday. How can both of these things be true?” The system also suggested two dogs had been active in the living room, when the user only owns one.
Neither case has been verified, and no widespread pattern has emerged beyond one other commenter mentioning a similar experience of being referred to as Kevin. Still, the official Google Nest Community account did respond to one of the posts, asking for a clip of the incident so the team could investigate further.
One commenter suggested that the behavior could simply come down to how large language models work. If the system predicts that a particular name best fits the context of what it’s describing, it may generate that name even if it isn’t real, since it can’t actually distinguish between an accurate description and an invented one.
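To illustrate that idea in the roughest possible terms, here is a hypothetical sketch (not Google’s actual pipeline, and the names and probabilities are invented): a language model fills the “who” slot of a summary with whatever continuation scores highest in context, and a common first name can easily win over a truthful but less story-like phrase.

```python
import random

# Toy illustration of the commenter's point: a model picks the most plausible
# continuation for a summary, with no ground truth about who was on camera.
# The names and weights below are hypothetical, purely for illustration.

def describe_clip(activity: str) -> str:
    # Pretend the model has learned that summaries like this usually continue
    # with a person's name, weighted by how "expected" each option is.
    candidates = {
        "Michael": 0.4,        # common name, high prior
        "Sarah": 0.3,
        "David": 0.2,
        "the resident": 0.1,   # accurate but less "story-like"
    }
    names, weights = zip(*candidates.items())
    chosen = random.choices(names, weights=weights, k=1)[0]
    return f"{chosen} was seen {activity}."

if __name__ == "__main__":
    # The model can't tell an accurate description from an invented one;
    # it only knows which continuation looks most likely.
    print(describe_clip("taking out the trash"))
```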
Assuming the users aren’t at fault, hallucination by Gemini on Google Home is the most plausible explanation. Either way, both the amused and the unnerved reactions are entirely understandable.