Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

| ID | Excerpt |
|---|---|
| `ytc_UgzeOCqzr…` | Humans are conscious. AI, at least for now, is a computational model — specific… |
| `rdc_ksp4tq2` | This is the fear: that AI doesn’t need to be perfect, just better than humans.… |
| `ytc_UgytSlHac…` | is it just me who feels Neil could have said some words about Asimov's view on A… |
| `ytc_UgyvvcI9X…` | *Why does an American need 3 robots? Answer: because a robot alone doesn't have … |
| `ytr_UgzD16VB7…` | To answer @ConfusedMessUwU and @Topboxicle all explain on my channel, but i'm no… |
| `ytc_Ugwe2qNwj…` | Nothing wrong with AI art. It’s better quality than a lot of peoples artwork alr… |
| `ytr_UgyupyOdL…` | @pooroldnostradamus Valid concern. I understand I tend to write towards the opti… |
| `ytr_UgzCj2Nkb…` | @SS4KirinBolt Always the same: "but future models, tho!" How is that even supp… |
Comment

> A warning of what? It might actually be better than most human made crap??? If you found out tomorrow the Beatles were AI would you now hate them???

youtube · AI Responsibility · 2026-02-01T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgxQXQQRC1QoLHzbyxN4AaABAg.ASexI1vHeAaASez37y5gZ4","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgxQXQQRC1QoLHzbyxN4AaABAg.ASexI1vHeAaASf6ey5NjFo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxQXQQRC1QoLHzbyxN4AaABAg.ASexI1vHeAaASf_D0DRJDi","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgxQXQQRC1QoLHzbyxN4AaABAg.ASexI1vHeAaASf_ug55dO3","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyCr_Jm3C6mju0lxYt4AaABAg.ASexAcZg6L0ASeyOnCTbKv","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwPvckSoFVDSFMr8eJ4AaABAg.ASewVXDNXR6ASezunzRFe0","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwPvckSoFVDSFMr8eJ4AaABAg.ASewVXDNXR6ASf0KEyyQcN","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytr_UgygX_HsEoZ2Qa08MqZ4AaABAg.ASew6xMhHyPASexjMPEDzD","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwsWcoN6_N7RvHMDDp4AaABAg.ASev3xBFCfRASexlIPHwZ2","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgwsWcoN6_N7RvHMDDp4AaABAg.ASev3xBFCfRASf-9s6s_dN","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
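A batch response like the one above is only useful if every row parses and every code falls inside the codebook. The sketch below shows one way to validate such a response in Python; the allowed value sets are assumptions inferred from the codes visible in this sample, and the full codebook may contain values not seen here.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# The real codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a batch coding response and index rows by comment id.

    Raises ValueError on missing ids or codes outside the allowed
    sets, so malformed model output is caught before it is stored.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
        # Keep only the coded dimensions, keyed by comment id.
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(parse_coding_response(raw)["ytr_example"]["emotion"])  # indifference
```

Failing loudly on an unknown code, rather than silently storing it, makes it easy to spot when the model drifts from the codebook (a common failure mode in LLM-assisted content coding).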