Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwcyAHmy…`: I wish these dudes would quit saying "we" like they speak for all of us. I know…
- `ytc_Ugx51K3o6…`: AI Has been around about 30 years, the processors are faster than ever. The elit…
- `ytr_Ugxxy_KXE…`: Elon was actually a part of an international org like this until billionaires st…
- `ytc_UgyTEoy0u…`: I'm extremely disabled. I used to play soccer, draw, paint, and play music all…
- `ytr_Ugzi2z4J9…`: @TrakinTechEnglish So no A.I at all then for you, or is there something you use…
- `ytc_Ugz3Tsrn4…`: AI is the true definition of the “you could’ve answered that in 3 words but did…
- `ytr_Ugy2FcyJ4…`: @wisemage0 Because when people learn from art they put their thought, humanity,…
- `ytc_UgxfblMZm…`: AI will no doubt revolutionize the world and will need to be managed wisely. How…
Comment
This is my idea of dangerous AI: A robot ala the "I Robot" schematic is not even "evil" but malfunctions ever so slightly. Accidentally, it hits and knocks over a baby carriage and then walks over the child. The same goes for those driverless cars. If you value human life, do not put it at risk for "fun", convenience, or the next new "thing".
Platform: youtube | Topic: AI Governance | Posted: 2026-01-12T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxmiU6lqG8uBYsIZWh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwKXrDbVOH5TYs0gYl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugz0T_GonwaZ8l5LYUB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
 {"id":"ytc_Ugyk91XFkUkej2F1lB14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugy9V9c_Tm_BWfXq-CZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugwp_RCj5imMLEEgmlR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgxEyO3YEeGfe13fR_F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
 {"id":"ytc_UgyZJ2cGxCHqETHA5J94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugwo16b_x29d3GIdNpp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugwq-qqnHNPibVZEpjJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
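The raw response above is a JSON array with one object per coded comment. A minimal Python sketch of how such a response could be parsed, validated, and indexed to support the by-ID lookup shown at the top of the page. The field names come from the response itself; the allowed value sets are assumptions inferred from the values visible here, not a documented schema:

```python
import json

# Two records copied from the raw model output above, for illustration.
raw_response = '''[
 {"id": "ytc_UgxmiU6lqG8uBYsIZWh4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_Ugy9V9c_Tm_BWfXq-CZ4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''

# Coded dimensions (per the table above). Value sets are assumptions
# inferred from this one response, not a documented codebook.
DIMENSIONS = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "mixed", "unclear"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment id."""
    by_id = {}
    for rec in json.loads(raw):
        # Skip records whose values fall outside the expected sets.
        bad = [d for d, allowed in DIMENSIONS.items()
               if rec.get(d) not in allowed]
        if bad:
            print(f"{rec.get('id')}: unexpected value(s) for {bad}")
            continue
        by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw_response)
print(codes["ytc_UgxmiU6lqG8uBYsIZWh4AaABAg"]["emotion"])  # fear
```

In practice the lookup dictionary would be persisted alongside the coding timestamp, so a truncated ID prefix like `ytc_UgxmiU6…` can be resolved to its full record.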