Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- waymo car "took off when the lights came on" black engineer part of production t… (ytc_Ugy2SvVgj…)
- I care about AI because as a technologist, I love learning about and applying em… (rdc_o8gno10)
- 80% human extinction possibly!! Scariest part is the people who know .... know i… (ytc_Ugz7ExJRz…)
- Maybe it can take human art and combine with other human art to make new possibi… (ytc_UgymqyH33…)
- We can't even get our government to arrest a bunch of criminal pedophile murdere… (ytc_UgyyRmEZL…)
- I thought I was the crazy one reading that goofy aah reddit about defending ai. … (ytc_UgylIqh_8…)
- Commissions are commissions, a prompt is a commission not an effort of artistic … (ytc_UgydpSl9r…)
- AI is not the problem. Those who use AI, for unethical purposes, are the problem… (ytc_UgxcmMxow…)
Comment

> I'm a school bus driver, how long before my job is taken over by AI? Sure, it can drive a bus but can it drive a bus and manage 50 disobedient and undisciplined children? I couldn't imagine. These children won't listen and they don't care, they would be in the drivers seat trying to take control of the bus.

youtube · AI Governance · 2025-09-05T00:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxr69gS1eP7o9MbErh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw9_0T4WfCfUZMJqwh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5HyJ88fV1vrwvHVV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyMqm9r3Hoje0v8g6Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxtoO4VpDd7ZUZyunZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz5bIdqbiXNVFWGfFl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzBA4z61yLJQVaaqeZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzMWlu_WFVusptItE54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwa73ghUqPBHytxsCR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx6OsRsjc-jLCMdZpJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]
```
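The raw response is a JSON array with one coding object per comment, carrying the four dimensions from the table above (responsibility, reasoning, policy, emotion) keyed by comment `id`. A minimal sketch of how such a batch response could be parsed and indexed for lookup by comment ID — the IDs and field names are taken from the response shown here; the parsing code itself is illustrative, not this tool's implementation:

```python
import json

# Two entries copied from the raw batch response above, for illustration.
raw_response = """
[
  {"id": "ytc_UgzBA4z61yLJQVaaqeZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx6OsRsjc-jLCMdZpJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"}
]
"""

# Index the coded dimensions by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugx6OsRsjc-jLCMdZpJ4AaABAg"]
print(code["policy"])   # industry_self
print(code["emotion"])  # fear
```

The same index also makes it easy to cross-check a displayed coding result against the raw model output it came from.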