Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
People are smart and at the same time people are so damn stupid. It’s like those super genius types who have no idea how to talk to another person on a human level. These companies are at the moment living out every horror movie we have seen involving AI. The many number of possibilities of the human destruction is way too numerous.
youtube · AI Governance · 2024-01-18T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyreUEyy6bDsfa3NdN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxaWRVCOllZYcVvmQN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgywanQgjQKLMLrS0H14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7Sh_rbDPPSl0o6Md4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzLhHF6ZDYG1ICN7S54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwJZwpUcLpHC-C0x1l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwLdo5ox4NbEUb8qXV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyg85VDYc8ZQCvtqHt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx7-bNdT3R1vS6cZoF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGqMRtgXoqtGIqKUV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"resignation"}
]
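The raw response above is a JSON array of per-comment coding objects, one per comment ID, each carrying the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and indexed by comment ID — the helper name and the inline sample payload are illustrative, not part of the tool:

```python
import json

# Illustrative sample payload in the same shape as the raw response above.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugx7-bNdT3R1vS6cZoF4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzLhHF6ZDYG1ICN7S54AaABAg",
   "responsibility": "unclear", "reasoning": "unclear",
   "policy": "unclear", "emotion": "approval"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Map comment id -> {dimension: value}, skipping malformed rows."""
    out = {}
    for row in json.loads(raw):
        if "id" in row and all(d in row for d in DIMENSIONS):
            out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugx7-bNdT3R1vS6cZoF4AaABAg"]["policy"])  # prints "regulate"
```

Validating that every row carries an `id` plus all four dimensions before indexing guards against the model omitting a field or emitting a stray object.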