Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
19:00 & onward - Reminds me of an Asimov story in which an amped-up robot brain (called Brain as i recall) not only solved the problem of faster than light travel which had stymied human physicists but designed & built with robot labor a prototype spaceship. This was a benign super-intelligence; Asimov was a technological optimist; i am decidedly not, but as his heroine Susan Calvin might have said: may the best mind win...
youtube · Cross-Cultural · 2025-10-16T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwz3_7XpPu2Yf6dfnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyrrxumlnhL5Et4SLN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz1qaZhq_jqPeDX6hh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyZ5RngF3jm0oPv5CN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyaMrKa9T6Qvfvadm14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwouqr17U7yMbQVhwB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBipNrEgxVJ9aMVNl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxtHdxyxqbUpBn9U6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzos_0LOJzxX7BaMk94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxXoHOU3vkifYacdfF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
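Because the model returns one JSON array coding a whole batch, looking up a single comment means parsing the array and indexing the records by their `id` field. The sketch below shows one minimal way to do that in Python; `index_by_comment_id` and the two-record excerpt are illustrative, not part of the actual pipeline.

```python
import json

# Excerpt of a raw model response: a JSON array of per-comment codes
# (two records copied from the full response shown above).
raw_response = '''[
  {"id": "ytc_Ugz1qaZhq_jqPeDX6hh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyZ5RngF3jm0oPv5CN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and map comment ID -> coded record."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugz1qaZhq_jqPeDX6hh4AaABAg"]["responsibility"])  # → ai_itself
```

A real pipeline would also want to handle malformed model output (e.g. wrap `json.loads` in a try/except) and duplicate IDs, which this sketch omits.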