The Home Office’s new “ChatGPT-style” LLM tool is riddled with mistakes that could mean life or death for people seeking asylum
The tool got things wrong, or omitted important information, in almost one of every 10 uses.