Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "The male robot ( Hansen) is far more realistic than ( Sophia) he also has a craz…" (ytc_Ugx0_aR_I…)
- "If he's been on Rogan and the grifter circles...he's a bs artist. AI bubble read…" (ytc_Ugwf1Z37A…)
- "It will likely work for years to come, how long is uncertain, as they might spen…" (ytc_UgynUIylT…)
- "There is already problems.... If the ai suddenly dicide it wont count too 100000…" (ytc_Ugz7Uu1cd…)
- "> now that many artists are aware of the dangers of ai and are making efforts to…" (ytr_UgyHlp75y…)
- "What developments are they hiding? AI can’t even perform as well as a senile hum…" (ytc_Ugwv-fl7F…)
- "@rqlk - Uber was doing autonomous vehicle testing several years ago here and had…" (ytr_UgynZwmPH…)
- "i hate AI, theyre ruining our art community. to all the people who use AI, youre…" (ytc_Ugwa40Q0K…)
Comment
I believe the shift from search engines to chatBots could significantly increase privacy challenges.
People often use search engines to explore highly personal topics that they would not like to share with friends, family, coworkers, or anyone. However, search queries are terse and typically do not reveal a ton of information about the person searching.
Now suppose a lot of people switch from search engines to chatBots. Now, instead of discussing those highly-personal topics through cryptic keyword searches, they're going to be entering whole sentences, including background information, spread across multiple-turn conversations.
As a result, your dialogs with a chatBot will likely identify you far more clearly and reveal far more about you than your searches ever did.
So... where do those sensitive conversations with chatBots go? Well, they're almost certainly being logged by the company running the chatBot. Some fraction of those logs will be studied by people in search of product improvements. They may also be processed automatically by many researchers as a form of training data. Either way, probably a lot of people are going to have access to your conversations with your favorite chatBot.
Now, if a lot of people have access, then there is a real possibility that they could be inappropriately accessed ("I wonder what that girl/guy I secretly like is chatting about???") or even leaked publicly by a malicious or injudicious insider or by external hackers.
What makes me especially nervous is that search engines are operated by large companies with world-class information security teams (Microsoft, Google). In contrast, chatBots are often run by startups with little or no security expertise. So I bet the probability that chatBot conversations leak publicly is probably far, far higher than search histories and they are probably far more sensitive.
So... that's kinda scary! So I think there are really serious privacy concerns around chatBots that call f
Source: reddit · AI Governance · posted 1680292053 (Unix timestamp) · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_jehef1c","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"rdc_jeg69ed","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"rdc_jeg9t6j","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"rdc_jegugm2","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"rdc_jeeryee","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]