Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugx0io_q9…`: "So, AI is learning the pros and cons of human thinking and behaviors. so we/they…"
- `ytc_UgwTZKCUI…`: "This is exactly our state too; what we consider beautiful is also much the same on the inside, skin…" (translated from Hindi)
- `ytc_UgyXXgRRv…`: "Only good thing ai does is make movies more accessible with subtitles otherwise …"
- `ytc_UgwdaOSj1…`: "0:38 it is a mix of people who recognise the similarity in how LLMs and humans p…"
- `ytc_UgyB51T6S…`: "he prinicpally misunderstands what ai is. Ai is not ai it is aips. artificial …"
- `ytc_UgwTxbRrP…`: "This is literally how 2030 is gonna be everything’s gonna be AI I’m scared becau…"
- `ytc_UgzqbKp4l…`: "On top of that cheap manufacturer's are now claiming that anything that responds…"
- `ytr_Ugyzvs-0A…`: "Eventually, many people could work by using VR/RC to teleoperate a robot from t…"
Comment
This will probably already have been mentioned, but one scenario would be that humanity just gets wiped out because AI's make climate change worse to try and solve it. We might see a nuclear winter style change, because it reasons it should lower overall temperature to keep running the best it can and solve its problems in 1 shot.
Like they said in the video, there's millions of ways this could end badly.
Platform: youtube
Video: AI Moral Status
Published: 2026-02-08T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzJrN25Teyc-btld014AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzhD0gSRExJAwS0zah4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy04Sg04fwpiuLQ4894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxaYhBPs4zE_99anzN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwuENLS4A5s89JlVR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx9F7Le4mP8wIJOi5N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyTXgAVrJyhNmWFsIt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFYtpcIqNRv9ueuzd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyo4GC6ns59ypmOq-N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxoezlHXCPGTlCG_dh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
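A response like this can be parsed and sanity-checked before the per-comment codes are stored. The sketch below is a minimal illustration, not the project's actual pipeline: the allowed value sets are inferred only from the categories visible in the sample above (the real codebook may define more), and `parse_coded_batch` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "indifference"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict carrying a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension has a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = (
    '[{"id":"ytc_UgzJrN25Teyc-btld014AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},'
    '{"id":"bad_record","responsibility":"alien"}]'
)
coded = parse_coded_batch(raw)
print(len(coded), coded[0]["policy"])  # 1 regulate
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a category label outside the codebook; such records can then be re-queued for coding rather than silently stored.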