Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I am tired of these Clowns! There is no AI, if there was it wouldn't kill us, as we're it's creators. It would be limitless and not see things under our short mind perspective and paradigm. I guarantee you that 2028 will come and nothing will happen! That's why these bozos do this. By 2030 this... By 2028 that... Then it becomes by 2035, 2050, 2055 until one realizes - that day never comes.
Platform: youtube · Topic: AI Governance · 2025-10-11T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw9y51_4rLdkgDRtfB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxr2yCpk4QGqYbpZ9x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzLCkwxjz4Nx5Gjugd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyPsyX3-cNj4mHMaTh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMT8e-mwtM3U2sm7Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzdO4X6PoO57hZ6-5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw0rlHgkw9jrEzuhq94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzwQX_BLaZbQPqh0Md4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwx91NypRz9DPtK3cB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzSCIRgBzC6JXIDAcF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
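A batch response like the one above has to be parsed and sanity-checked before the codes are stored. Below is a minimal Python sketch of such a validation step. The allowed label sets are inferred from the values that appear in this page's data (the full coding scheme may include more labels, and `validate_batch` is a hypothetical helper, not part of the tool's actual code):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real coding scheme may define additional labels.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "company", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    dimension carries one of the allowed labels; anything else is
    silently dropped so one bad record doesn't poison the batch.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropping malformed records rather than raising keeps the pipeline running when the model occasionally emits an off-schema label; the rejected IDs can then be re-queued for another coding pass.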