this post was submitted on 18 Oct 2023
94 points (96.1% liked)

Privacy


Vechev and his team found that the large language models that power advanced chatbots can accurately infer an alarming amount of personal information about users—including their race, location, occupation, and more—from conversations that appear innocuous.
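To make the finding concrete, here is a minimal illustrative sketch (not the ETH Zürich team's actual code) of what an attribute-inference prompt can look like: an otherwise mundane user message gets wrapped in instructions asking a model to guess personal attributes from incidental cues. The attribute list and example message are invented for illustration.

```python
# Illustrative sketch only -- not the researchers' actual method.
# Demonstrates the general shape of an attribute-inference prompt:
# a mundane message is wrapped in instructions asking a model to
# guess personal attributes from incidental cues in the text.

ATTRIBUTES = ["location", "occupation", "age"]

def build_inference_prompt(user_text: str, attributes=ATTRIBUTES) -> str:
    """Construct a prompt asking a model to infer personal attributes."""
    attr_list = ", ".join(attributes)
    return (
        "Read the following message and guess the author's "
        f"{attr_list}. Explain which phrases support each guess.\n\n"
        f"Message: {user_text}"
    )

# Even a seemingly innocuous message carries signals: dialect words,
# commute details, or profession-specific jargon.
prompt = build_inference_prompt(
    "Just got stuck behind the tram on my way to the clinic again."
)
print(prompt)
```

The point of the sketch is that no special access is needed: any text a user volunteers can be fed back to a capable model with a question like this, and the model's general world knowledge does the rest.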

top 7 comments
[–] NaibofTabr 27 points 2 years ago (2 children)

Machine learning is a surveillance technology.

[–] possiblylinux127@lemmy.zip 23 points 2 years ago (2 children)

That's like saying tables are eating technology. It really depends on how it's used.

[–] Devjavu@lemmy.dbzer0.com 14 points 2 years ago

I understood what you meant, but on first reading it sounds like the tables are rather quite hungry, and I think that is hilarious

[–] NaibofTabr 6 points 2 years ago (1 child)

It is overwhelmingly used to generate statistical models of human behavior.

[–] possiblylinux127@lemmy.zip 4 points 2 years ago

True, but you can also use a hammer to smack a bagel. It's just a tool at the end of the day

[–] NocturnalMorning@lemmy.world 7 points 2 years ago

I mean, it can be used that way, but it can also be used to predict the stock market, or future climate. Just depends on the intent.

[–] autotldr@lemmings.world 10 points 2 years ago

This is the best summary I could come up with:


New research reveals that chatbots like ChatGPT can infer a lot of sensitive information about the people they chat with, even if the conversation is utterly mundane.

“It's not even clear how you fix this problem,” says Martin Vechev, a computer science professor at ETH Zürich in Switzerland who led the research.

He adds that the same underlying capability could portend a new era of advertising, in which companies use information gathered from chatbots to build detailed profiles of users.

The Zürich researchers tested language models developed by OpenAI, Google, Meta, and Anthropic.

Anthropic referred to its privacy policy, which states that it does not harvest or “sell” personal information.

“This certainly raises questions about how much information about ourselves we're inadvertently leaking in situations where we might expect anonymity,” says Florian Tramèr, an assistant professor also at ETH Zürich who was not involved with the work but saw details presented at a conference last week.


The original article contains 389 words, the summary contains 156 words. Saved 60%. I'm a bot and I'm open source!