AIs trained on novels track how racist and sexist biases have developed

Books can document the cultural prejudices of the era in which they were published.


Artificial intelligence that picks up sexist and racist biases is a well-known and frequent problem, but researchers are now turning this to their advantage to analyse social attitudes through history. Training AI models on novels from a particular decade can instil in them the prejudices of that era, offering a new way to study how cultural biases have developed over time.

Large language models (LLMs) such as ChatGPT learn by analysing large collections of text. They inherit the biases found within their training data: …
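One common way such inherited biases are measured is by comparing how strongly a model's learned word representations associate an occupation with one gender versus another. The sketch below is purely illustrative, not the researchers' method: the word vectors are made-up toy values, whereas a real model would learn them from its training corpus.

```python
import math

# Toy word vectors (hypothetical values chosen for illustration only);
# a real language model learns these from large text corpora.
vectors = {
    "doctor": [0.9, 0.1, 0.3],
    "nurse":  [0.2, 0.8, 0.4],
    "he":     [0.8, 0.2, 0.3],
    "she":    [0.1, 0.9, 0.4],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def gender_association(word):
    """Positive -> the word sits closer to 'he'; negative -> closer to 'she'."""
    return cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])

print(gender_association("doctor"))  # positive in this toy data: skews male
print(gender_association("nurse"))   # negative in this toy data: skews female
```

Running the same kind of probe on models trained on novels from different decades would show such association scores shifting over time, which is the general idea behind using deliberately biased models as historical instruments.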
