Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "If you give "AI' a goal, it will complete the task in the most efficient and eff…" (ytc_UgwWyEX4p…)
- "+Michael Thomas Walking is slower than driving, unless you are in a traffic jam …" (ytr_Uggm5Bdzw…)
- "I don't care how quote unquote impressive it is that your AI slop is indistingui…" (ytc_UgzeFsd4o…)
- "AI "artists" are as much "artists" as i am a chef when i reheat food in the micr…" (ytc_Ugw7FnIyt…)
- "When you talked about things than we might be better than AI, I have an idea: if…" (ytc_UgzkhDZSI…)
- "most ai images arent that free, if y ou want to make better ones you needa pay a…" (ytr_Ugy1sEZxI…)
- "There are a lot of legit articles out there these days. A professor at the Unive…" (rdc_fctc77k)
- "People I'm telling you AI is BAD I MEAN BAD WE HUMANS ARE IN TROUBLE WHY BUILD T…" (ytc_Ugy8d-h0O…)
Comment
I believe AI is dangerous in the wrong hands which it will be if it isn’t already. Apart from that, why is every topic discussed in DOAC so bleak? Shouldn’t there be a responsibility to present a counter argument to such a bleak present and future of does that not exist?
Source: youtube · Topic: AI Governance · Posted: 2026-03-12T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxdx5rV8DGQFlmGbx54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzSiOl6goEqLM_2gkt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwUorKqXDxRnkSMQmd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOPNkphegAH9jzjwt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYAXhzMfmYdO5FmZF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyiiwlthXyUcwkWjst4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxqHnO8Ei_InhDUoMB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxTJ4jSebFt20BJgAp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymYX2rYfvOblR_Tkl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy9RYZg_lpa19SbfCJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```