We welcome the report from the Committee on Standards in Public Life as the start of an essential debate about how algorithms are affecting important decisions in the public sector – but some technologies, such as facial recognition, are simply too harmful to regulate and must be banned.
We learned about the struggle to combat the Huduma Namba system, which could worsen the discrimination faced by Kenya's Nubian ethnic minority.
The leaked document proves that tech firms know content moderation can cause PTSD and are seeking to avoid responsibility by making workers sign release forms.
Thousands of people are employed in what is effectively the tech equivalent of the fashion industry’s sweatshops, and many suffer serious harm as a result of poor working conditions and a lack of support and training.
We teamed up with a former content moderator and Amnesty’s Deputy Director of Tech to explore how the human rights of content moderators can be protected.
Today Foxglove launched our first legal challenge in the UK! We’ve teamed up with the Joint Council for the Welfare of Immigrants to challenge the Home Office’s computer algorithm for making decisions about visitor visa applications.