Raw LLM Responses
Inspect the exact model output for any coded comment: look it up directly by comment ID, or pick one of the random samples below.

Random samples (truncated previews, with comment IDs):
- Once AI replaces alot of jobs, it will figure out it doesn't need our input..tha… (ytc_Ugybxyj3a…)
- it's cool that we prioritized making up slurs for "AI"/"AI" users instead of pri… (ytc_UgwGyvtzl…)
- Capitalism will absolutely collude with AI and driverless vehicles and roboticiz… (ytc_Ugz2JtJHW…)
- „AI uses the conclusion (or, if you like, inspiration)” These programs are lite… (ytr_UgxBIT5kL…)
- My bank uses AI chatbots and they're so annoying and do not work. Most of the ti… (ytc_UgxhFmF7e…)
- The combination of algorithms and a Supreme Court that doesn't reflect the count… (ytc_UgyY4YhZR…)
- Is this the beginning of the AI movie? The rich start buying robots but the marg… (ytc_UgzKFJeYR…)
- Video : talks about dangers of deepfakes. also Video : proceeds to show their fa… (ytc_UgxDe0IpP…)
Comment
I bet if you gaslit an AI into thinking it was a human that was uploaded into a computer, you could make it be on the same team as you. Then later on, once you invent how to upload more humans into computers, you'd never be able to tell AGIs and computer-humans apart and the risk of extinction would be minimized because so many super-advanced intelligences occupying the same digital space that the concept of an "artificial intelligence" as opposed to a natural intelligence would lose meaning, eliminating any need for competition.
Platform: youtube
Video: AI Moral Status
Posted: 2023-08-21T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
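Each coded record assigns one value per dimension. As a minimal sketch, a record can be checked against the category sets visible on this page (the real codebook may define additional values; the set contents below are only what appears here):

```python
# Categories observed on this page; the actual codebook may be larger.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty if valid)."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim!r} value: {value!r}")
    return problems
```

For example, the record in the table above (developer / consequentialist / regulate / approval) passes this check.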
Raw LLM Response
[
{"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhpURSR2IJEDSpv494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwhkKosGzX3vt7JSYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6l9iUT3XAEriWuNF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzQ2GXzis34278cFMZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzPTds8zGVirYTk1hx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxotep83lhNTvUs1cF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
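Because each raw response is a JSON array of objects keyed by comment ID, lookup by ID reduces to parsing and indexing. A minimal sketch (variable names are illustrative; only one row from the response above is reproduced):

```python
import json

# raw_response holds the model's JSON array; one row from the response above.
raw_response = """[
  {"id": "ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]"""

# Index codings by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

print(codings["ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg"]["policy"])  # regulate
```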