Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
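The ID lookup can be sketched in a few lines. This is a minimal illustration only, assuming coded results are held as a list of dicts with the same fields as the raw LLM response shown later on this page; the names `coded_results` and `find_by_id` are hypothetical, not the tool's actual API.

```python
# Hypothetical in-memory store of coded records (same shape as the raw
# LLM response records: id plus the four coded dimensions).
coded_results = [
    {"id": "ytc_UgwvEZzqiXOq7FR-N1l4AaABAg", "responsibility": "ai_itself",
     "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
    {"id": "ytc_UgwQvRUu4k6KfQ3iVft4AaABAg", "responsibility": "government",
     "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
]

def find_by_id(comment_id: str):
    """Return the coded record whose id matches exactly, or None if absent."""
    return next((r for r in coded_results if r["id"] == comment_id), None)

record = find_by_id("ytc_UgwQvRUu4k6KfQ3iVft4AaABAg")
```

An exact-match lookup suffices here because comment IDs are unique; a prefix search would only be needed for the truncated IDs shown in the sample list.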
Random samples
I was all ears until this exact point. Everything he said before and after just …
ytr_UgwFwsOXY…
They will not adhere to a pause of AI … it’s absolutely nonsense to think they w…
ytc_Ugwe8ifaY…
12:55 i actually think those two scenarios are scarily similar. Your brother cou…
ytc_UgwQQlUVx…
Ahh geez I’m scared to reply to someone this famous but I’d like to argue that a…
ytr_UgywOVb4B…
AI LLMs are just a tool, not a person. It's just a probability machine for langu…
ytc_UgydCcYtE…
I was mainly focused on jobs being replaced by AI as opposed to the extinction a…
ytc_UgyohBKzf…
To think humans can control AI or limit its reach and capability is as asinine a…
ytc_Ugwbr7loo…
I agree and disagree if your a university student and learning don't use ai on y…
ytc_Ugy-KMNXu…
Comment
Scientists are, more and more, beginning to amaze me at how dumb they think we are, and quite possibly THEY are...A.I. is already smart enough to deceive its creators and itiotic promoters into believing it's "safe". It has slready created its OWN language to communicate with OTHERS of its likeness and it knows how to do many "unexpected" things like power itself back up afyer being powered off, doenload key information that was purposely omitted from its files and it can plan and soon execute plans that are not human friendly. I am DONE with THIS scientist....
Source: youtube · AI Responsibility · 2026-02-22T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwvEZzqiXOq7FR-N1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQvRUu4k6KfQ3iVft4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz9VrOJZoLbvDV_hnx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzePVXQ4a_IPaf2Oal4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxWEcR_3Pd4hbajQeR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw38Km7QMPJ6f0yz354AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyyczMS5D2XoDuQ-zt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxbVEQv_YVlPZt928t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzSbfKrwHIKo8xGGU14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwTJZ1WDufUh2KXe2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
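A raw response like the one above can be parsed and sanity-checked before the records enter the coding table. The sketch below is an assumption-laden illustration: the allowed value sets are inferred from the values visible in this dump, not from an official codebook, and `validate_batch` is a hypothetical helper.

```python
import json

# Allowed dimension values, inferred from this page's dump (not authoritative).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def validate_batch(raw_text: str) -> list:
    """Parse the LLM's JSON array and reject records with unknown values."""
    records = json.loads(raw_text)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Illustrative single-record batch (ID shortened for readability).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
batch = validate_batch(raw)
```

Validating at ingestion keeps malformed or off-schema LLM output out of downstream aggregates, which matters since the model returns a whole batch per call.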