Screen Time | Even if I don’t hand over my data for AI training, it gets used anyway

In Screen Time, tech reporter Rutger Otto writes about the internet every week. This week he objects to Meta using his personal data to train AI. And he shares your reactions to last week’s blog.

The Dutch privacy watchdog is calling on you to take action if you don’t want Meta to use your messages and photos to train its AI.

Meta wants to collect as much data as possible to better understand “the various nuances and complexity of European users”. This includes, for example, interpreting dialect and jokes. Sarcasm is sometimes hard for people to grasp, let alone a computer.

Make no mistake: Meta AI, but also competitors such as ChatGPT, already have a huge amount of data. AI professor Frank van Harmelen of VU Amsterdam says the software of these companies has by now read all public information on the internet. “Everything that is online has been learned,” he says. “The internet is used up. So Meta needs to tap into new sources, such as private conversations on Instagram.”

That may be so. I prefer to keep private conversations exactly that: private. The same applies to photos I have shared on my private account. I know that I am the product on services I use for free, but there are limits.

“My brother-in-law found out in a conversation with the AI chatbot that the chatbot knew who he is in a relationship with,” says AI researcher Roel Dobbe of TU Delft. “That is personal information that has ended up in datasets and is being commercialized by developers.”

So there is already a lot of information in the training data of AI developers that we know nothing about. Because the companies are so secretive about it, you never find out exactly what they know and what they will (ever) use it for.

Dobbe is uncomfortable with Meta’s choice of an opt-out (‘use my data unless I object’). “This places the responsibility on the end user. While it should really be: the polluter pays.”

Or as Lotje Beek of civil rights movement Bits of Freedom describes it: “It’s as if a rich guest walks into your house, grabs your photo albums, walks out and shouts: ‘Just run after me if you think this isn’t allowed.’”

While the Dutch Data Protection Authority wonders whether what Meta is doing is allowed at all, I do what I can: object. Even though I know I will never fully succeed in staying out of Meta’s AI models. Because if friends don’t object and post a photo with me in it, my face will end up in the dataset via that route anyway.

And how do I get my data out again? That will be an impossible mission. “Your information disappears into a huge mush and can never be retrieved,” says Van Harmelen. “But who knows, your quotes may appear in conversations that Meta’s chatbot has with others. You may not like that, especially when it comes to sensitive information.”
