
Anthropic will start training its AI models on user data, including new chat transcripts and coding sessions, unless users choose to opt out. It's also extending its data retention policy to five years, again for users who don't opt out.
All users will have to make a decision by September 28th. For users who click "Accept" now, Anthropic will immediately begin training its models on their data and keeping that data for up to five years, according to a blog post published by Anthropic on Thursday.
The setting applies to "new or resumed chats and coding sessions." Even if you agree to let Anthropic train its AI models on your data, it won't do so with previous chats or coding sessions that you haven't resumed. But if you do continue an old chat or coding session, all bets are off.
Image: Cath Virginia / The Verge, Getty Images
