Fable App Changes AI-Generated Summaries After Complaints of Offensive Language
Problems with AI-Generated Summaries
Fable, a popular app for talking about and tracking books, is changing the way it creates personalized summaries for its users after complaints that an artificial intelligence model used offensive language. The app’s head of product, Chris Gallello, addressed the issue on Instagram, saying that Fable began receiving complaints about "very bigoted racist language, and that was shocking to us."
Examples of Offending Summaries
One summary suggested that a reader of Black narratives should also read white authors. Another told a reader that her book choices "could earn an eye-roll from a sloth." A third told a reader that her books were "making me wonder if you're ever in the mood for a straight, cis white man's perspective."
Fable’s Response
Fable replied to the complaints, saying that a team would work to resolve the problem. Gallello stated that the company would introduce safeguards, including disclosures that summaries were generated by artificial intelligence, the ability to opt out of them, and a thumbs-down button that would alert the app to a potential problem.
Background on the Issue
One reader, Ms. Trammell of Detroit, downloaded Fable in October to track her reading. She was shocked when her Fable summary suggested that she should read white authors in addition to Black authors. She shared the summary with fellow book club members and on Fable itself, where other users posted offensive summaries that they, too, had received or seen.
Consequences of the Issue
The incident has raised questions about the use of AI in book-tracking apps and the potential for bias in AI-generated content. Some readers have responded online, saying they are switching to other book-tracking apps or criticizing the use of AI in a forum meant to celebrate and amplify human creativity through the written word.
Conclusion
Fable’s decision to add safeguards to its AI-generated summaries is a step in the right direction, but the company must also ensure that its AI model avoids offensive language and bias. The incident underscores the importance of human oversight and review of AI-generated content to prevent similar problems in the future.
FAQs
Q: What is Fable?
A: Fable is a popular app for talking about and tracking books.
Q: What is the issue with Fable’s AI-generated summaries?
A: Fable’s AI-generated summaries have been criticized for using offensive language and biases.
Q: What is Fable doing to address the issue?
A: Fable is introducing safeguards, including disclosures that summaries are generated by AI, the ability to opt out of them, and a thumbs-down button that will alert the app to potential problems.
Q: Will Fable continue to use AI-generated summaries?
A: Fable will continue to use AI-generated summaries, but with additional safeguards to prevent offensive language and biases.
Q: What is the impact of this issue on readers?
A: The issue has raised concerns about bias in AI-generated content, and some readers say they are switching to other book-tracking apps or criticizing the use of AI in a forum meant to celebrate human creativity through the written word.