Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Humans are incredibly stupid in some ways, e.g., in 2024 elected a convicted felon as president. But in other ways, we're much more brilliant than most appreciate. Look at "FSD cars".....as noted in this video, actual (unsupervised) FSD may never be available on a widescale basis, as that technology simply cannot deal with the demands of driving under all conditions as well as humans can. My speculation.....rarely considered, is that our pursuit of AGI may result in acknowledgement that the human mind has important capabilities not replicable via algorithms. And it's those unique capabilities that will allow humans to retain control of AI.
youtube · AI Governance · 2025-08-03T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyqnQCi9-DvVTe6tRN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwFdgI_BU_x2NMQF-p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz61Oknf_QmkQFXjKR4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzTMwAl-ZJxoNxw8cZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyl3Q49B7dL_GJFRh94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwKdk2z7bBl3tiaYh94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzTtpvi_vnVUqBvWOl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyP4NVUa7cJgH4TwVp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwWCZasCDSaEkhaDyB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw3NxuyRpqihL_fiJx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
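The raw response above is a JSON array with one record per comment, each carrying the four coded dimensions from the table. A minimal sketch of how such a response can be parsed and indexed for by-ID lookup (the comment IDs and dimension labels are copied from the response above; the helper name and validation logic are illustrative, not the tool's actual implementation):

```python
import json

# Abbreviated raw LLM batch response, taken from the dump above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyqnQCi9-DvVTe6tRN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwWCZasCDSaEkhaDyB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

# The four dimensions seen in this dump; the real codebook may define more.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    skipping any record that lacks an ID or a dimension."""
    indexed = {}
    for rec in json.loads(raw):
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgwWCZasCDSaEkhaDyB4AaABAg"]["responsibility"])  # company
```

Skipping malformed records rather than raising keeps a single bad line in the model output from discarding the whole batch, which matters when one response codes ten comments at once as above.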