Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Even without the broader context, the actor’s facial cues alone suggest a positi… — ytc_UgwqfGtbF…
The problem with exponential growth, as it pertains to AI, is that you have to h… — ytc_Ugz27CdL9…
to comment on the comment about "what's wrong with letting ai do the work for hu… — ytc_Ugy7a2Lx_…
@simple1818 that's not even true. Claims about AI are overblown and overra… — ytr_UgzlJ1jdy…
At a time when water shortages due to climate change are predicted to be the bi… — ytc_Ugw9DQwcq…
not to be rude but I don't think that's their point there ask "did the A.I make … — ytr_Ugzw_s1ht…
Varun got his thinking wrong 😂 he is too liberal too see why these jobs will be f… — ytc_UgwMhoUE5…
Humans: "AI we want you to find a way to solve word hunger" AI: *kills all huma… — ytc_UgzHruHXq…
Comment
WoW.... "None of the AI regulations apply to military usage of AI". This is only partly correct, because International Humanitarian Law (IHL) regulates AI, and Department of Defense Directive 3000.09 regulates AI in the United States.
youtube
AI Governance
2025-06-16T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwOLEslm-y4WZeywlJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyzFZ_kcuu_uBlhzQp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyjuU6BnerDG8X1tKt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1Z88k3P6vV2pFp5N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"unclear"},
{"id":"ytc_UgxATvcDObYs1rVb_dJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyScV6Ts1PfROxpB1h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3XFQk7lg3FIcTwYJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxXltq_3BqssUxUQJJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6_7DDFucvo-z4Qk94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwZVGnOLRPbqzFF6jh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
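The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response could be parsed, validated, and indexed for lookup by comment ID (the allowed label sets below are inferred only from the values visible in this dump; the actual codebook may define more categories, and the truncated response is shortened here to two records):

```python
import json

# Two records copied from the raw LLM response above, for illustration
raw = '''
[
  {"id":"ytc_UgwOLEslm-y4WZeywlJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy1Z88k3P6vV2pFp5N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"unclear"}
]
'''

# Assumed label sets, inferred from labels seen in this page dump only
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "developer", "company", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"unclear", "approval", "fear", "indifference", "outrage", "resignation"},
}

def validate(records):
    """Index records by comment ID, rejecting any unknown label."""
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        by_id[rec["id"]] = rec
    return by_id

coded = validate(json.loads(raw))
# Look up a single comment by its ID, as the panel above does
print(coded["ytc_Ugy1Z88k3P6vV2pFp5N4AaABAg"]["policy"])  # regulate
```

Validating labels at parse time catches the common failure mode where the model emits a value outside the codebook (e.g. a free-text emotion), so bad codes surface immediately rather than silently entering the coded dataset.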