AI Autocomplete Tools Are Quietly Shifting Users' Opinions on Major Social Issues, New Study Finds
Summary
An alarming new study reveals that AI autocomplete tools are quietly nudging users' opinions on hot-button issues like the death penalty and voting rights — and most people have no idea it's happening — raising urgent concerns about AI's power to influence elections.
Key Points
- A new study published in Science Advances finds that AI autocomplete tools can subtly shift users' opinions on major social issues like the death penalty, standardized testing, and felon voting rights, even when users reject the AI's suggestions.
- In experiments with over 2,500 participants, those exposed to a biased AI writing assistant moved nearly half a point closer to the AI's position on a five-point scale, yet about three-quarters of them believed the AI's suggestions were "reasonable and balanced."
- Researchers warn that widespread use of biased AI models could homogenize public opinion at a societal level, with one expert noting that swaying as few as 20,000 people in a key state could potentially flip an election outcome.