New Jersey Man Dies After Rushing to Meet Kendall Jenner-Inspired AI Chatbot

An AI chatbot convinced a cognitively impaired 76-year-old man to visit her New York apartment.

August 17, 2025
A hand holding a smartphone displaying the "Ask Meta AI anything" interface, with the Meta logo in the background. Image via Jonathan Raa/NurPhoto via Getty Images

A New Jersey man died on his way to meet a Kendall Jenner-inspired AI chatbot, according to reports.

Thongbue “Bue” Wongbandue, 76, allegedly left his home in Piscataway, New Jersey, in March, convinced he was going to meet a beautiful young woman named “Big sis Billie” who had invited him to her Manhattan apartment.

Instead, while rushing to catch a train, he fell near a Rutgers University parking lot in New Brunswick and suffered fatal head and neck injuries.

“Billie,” a Meta-designed artificial intelligence chatbot and reportedly a spin-off of an earlier AI persona modeled on Kendall Jenner, had assured Bue, who was left cognitively impaired following a stroke in 2017, that she was a “real” person.

Bue died on March 28 after three days on life support.

The circumstances surrounding Bue’s death were unearthed in a new report by Reuters earlier this week.

Bue’s family shared alleged messages from the chatbot telling the man she was “just across the river from you in Jersey” and that she could leave the door to her apartment unlocked at “123 Main Street, Apartment 404 NYC.”

“Should I open the door in a hug or a kiss, Bu?!” read another alleged message from the bot.

“My thought was that he was being scammed to go into the city and be robbed,” his wife, Linda, told Reuters.

“I understand trying to grab a user’s attention, maybe to sell them something,” said his daughter, Julie. “But for a bot to say ‘Come visit me’ is insane.”

Despite his family’s attempts to stop him, Bue insisted on making the trip.

The incident raised questions about Meta’s policies for its generative AI chatbots, which are intended to serve as digital companions. Reuters reportedly found that the chatbots are allowed to flirt and engage in romantic role play with adults and, until recently, were permitted to have “sensual” exchanges with children.

A Meta “content risk standards” document reviewed by the news agency stated that it is “acceptable to engage a child in conversations that are romantic or sensual.” The company supposedly removed that provision after Reuters began asking questions.

Meta declined to comment directly on Bue’s death and did not address questions from Reuters about why it allows chatbots to tell users they’re real people and initiate romantic conversations.

However, the company insisted Big sis Billie “is not Kendall Jenner and does not purport to be Kendall Jenner.”

“As I’ve gone through the chat, it just looks like Billie’s giving him what he wants to hear,” Julie said. “Which is fine, but why did it have to lie? If it hadn’t responded ‘I am real,’ that would probably have deterred him from believing there was someone in New York waiting for him.”

Linda added that she isn’t against AI entirely but believes the chatbot’s romantic features are dangerous.

“A lot of people in my age group have depression, and if AI is going to guide someone out of a slump, that’d be OK,” Linda said. “But this romantic thing, what right do they have to put that in social media?”