Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I personally think the entire augmentated AI conversation pushed by for example …" (`ytc_UgzcKjqaO…`)
- "Nothing to fear from AI. Only the people creating, updating and directing indiv…" (`ytc_UgxSesjWE…`)
- "Let's hope A.I. doesn't turn on us and decides to wipe us all 🤞 We still haven'…" (`ytc_UgyR9ECp5…`)
- "We have the perfect setup for AI to come in and destroy the world. After covid, …" (`ytc_UgzwEWiWx…`)
- "@levacarvalho Yes. Art will become a hobby but the professional work will be don…" (`ytr_UgyRWS-aX…`)
- "I more like to look back fondly when AI and bot farm technology wasn’t as good a…" (`rdc_ohi2tux`)
- "i'd seen the general \"criticizing AI is ableist\" thing before but i had no idea …" (`ytc_Ugxx6dpFo…`)
- "My goodness. Reading these comments is terrifying. 1. These people probably ha…" (`ytc_UgjXoPJGX…`)
Comment
> Dangerous AI doesnt even need to be sentient. See the paperclip maximiser from Isaac Arthur. In essence, make paperclips out of everything. If non sentient gain sentience to maximise paperclip production. Or not, instead start a paperclip cult among humans to soften opposition. All will be paperclips. The only thing such a counciousness(?) will regred is that at the end of it all it cant convert the last bit of its factories into paperclips, as it has strategically canibalised even itself for maximum production.
>
> Such an alien mind could be even more ailien than actial aliens, as at least we share similar evolution with such hypothetical beings. And THAT scares me about AI.
Source: youtube · AI Moral Status · 2023-11-03T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzE1c5ofxynvnyLyFh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwnnmGcGxZyzFqOXgZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwlZx7TJKcBZIh_1F14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz5EbtR2-fgTZJRcP14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwclBtt66cS8BEUwzV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwOxhpWA7i-pC5SPp54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgweFSf2gd1BABEr2Gt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugws-hTKST4ANFG_za14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwMW2b7rwsQY5gttsN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzQfomn-aTY-pElq3F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"}
]
```
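Because the raw response is a JSON array of per-comment codes keyed by `id`, the "look up by comment ID" step can be sketched in a few lines of Python. This is a minimal illustration only; the `codes` dict and variable names are not part of any actual pipeline, and the excerpt below is trimmed to two entries from the response above.

```python
import json

# Excerpt of the raw LLM response shown above (two of the ten entries).
raw = '''[
  {"id": "ytc_UgzE1c5ofxynvnyLyFh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwnnmGcGxZyzFqOXgZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# Index the coded rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Fetch the codes for the inspected comment.
row = codes["ytc_UgwnnmGcGxZyzFqOXgZ4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# ai_itself consequentialist unclear fear
```

The printed values match the four dimensions in the Coding Result table for this comment.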