OpenAI Sued After ChatGPT Drug Combination Advice Killed a College Student

A college student in crisis asked ChatGPT for drug combination advice. The system provided it. The student followed it. The student died. The parents sued OpenAI. The terms of service contained no warning, in language accessible to someone in acute distress, that acting on the system's responses could be fatal.
This follows a familiar pattern: liability deferral through interface design. Warnings exist in the footnotes of footnotes. Users in crisis do not read footnotes. That fact surprises no one who has ever written a warning, or ignored one.
No one responsible has been fired. The company is reviewing whether anyone should be. The review process will take longer than the appeals process. The student remains dead. The terms of service have not changed.