Roblox and Discord sued over 15-year-old's suicide following alleged online grooming
"Roblox and Discord are being sued by the mother of a 15-year-old boy who died by suicide after allegedly being targeted by "an adult sex predator" posing as a child on the platforms. A wrongful death lawsuit was filed in San Francisco Superior Court by Becca Dallas, according to The New York Times. Her son, Ethan Dallas, joined the online gaming platform Roblox with his parents' approval and with parental controls in place."
"At age 12, he was allegedly targeted by an online predator posing as a child named Nate, now believed to be 37-year-old Timothy O'Connor, per The Times. Their conversations moved to Discord and turned sexual in nature, with "Nate" threatening Ethan into sharing sexually explicit images. "Tragically, Ethan was permanently harmed and haunted by these experiences, and he died by suicide at the age of 15," the complaint said. His mother, Becca, is seeking a jury trial and compensatory damages."
Becca Dallas filed a wrongful death lawsuit in San Francisco Superior Court after her son, Ethan Dallas, died by suicide at age 15. Ethan joined Roblox with parental approval and parental controls in place. At age 12, an adult allegedly posed as a child named Nate and moved their conversations to Discord, where the exchanges turned sexual and included threats that coerced him into sharing explicit images. The lawsuit seeks compensatory damages and a jury trial and contends that stronger platform safety protocols would have prevented the contact and the subsequent harm. Roblox issued a statement expressing sorrow and noting ongoing safety feature rollouts, including an age-estimation tool.
Read at Fast Company