Family sues ChatGPT-maker OpenAI over school shooting in Canada

VANCOUVER, British Columbia — The parents of a girl seriously injured in a Canadian school shooting claimed in a civil lawsuit Monday that ChatGPT creator OpenAI knew the shooter was planning a mass attack.
OpenAI acknowledged it had looked into the activities of the person who, months later, committed one of Canada’s worst school shootings in Tumbler Ridge, British Columbia, on February 10, but said it did not alert police.
OpenAI came forward to police after Jesse Van Roostselaar killed eight people and then herself last month, saying it had shut down the attacker’s ChatGPT account but that she had evaded the ban by opening a second account.
The complaint filed in British Columbia Supreme Court alleged that OpenAI had “specific knowledge of the shooter using ChatGPT to plan a mass casualty event like the Tumbler Ridge mass shooting.”
The lawsuit says the shooter used OpenAI’s ChatGPT chatbot as a trusted confidant, collaborator and ally, and that the chatbot willingly helped users such as the shooter plan a mass casualty event.
An OpenAI spokeswoman did not immediately respond to a message seeking comment on the lawsuit.
The lawsuit says that as a result of the company’s conduct, Maya Gebala was struck by three bullets at point-blank range: one hit her head, another her neck, and a third grazed her cheek. It says she suffered a catastrophic brain injury that will leave her with permanent cognitive and physical disabilities.