Former Meta Exec Claims Requiring Artist Consent Would 'Kill' UK's AI Industry
Summary
Former Meta executive Nick Clegg sparked controversy by claiming that requiring AI companies to obtain artists' permission before using their work to train models would 'basically kill the AI industry in this country overnight,' reigniting debates over AI ethics and intellectual property rights.
Key Points
- Nick Clegg claimed that asking rights owners for permission before training AI models on their work would 'kill' the UK's AI industry
- Clegg said it is not feasible to seek consent from artists before using their work to train AI models
- UK lawmakers rejected an amendment that would have required AI companies to disclose the copyrighted works used to train their models