Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "As an artist, I support the idea of AI being used as a tool to progress things l…" (ytc_UgyXheqYG…)
- "I usually wish I was younger but given what AI has done to new college grads, I’…" (ytc_UgxlSv3bY…)
- "Not that the AI doesn’t need fixing, but if guns were actually restricted, likel…" (ytc_Ugy6GZKBU…)
- "I feel like the question in the beginning is odd, in my vision of future AI at l…" (ytc_Uggy6hX90…)
- "Probably going to be roasted for this, but I see AI as nothing more than a tool.…" (ytc_UgxYEWS4u…)
- "8:44 AI don't seem so bad all of a sudden. Now I see why the narrative changed😂…" (ytc_UgzBM7do0…)
- "But they haven't said anything about it, or done any layoffs. So no you're wrong…" (rdc_ksku41v)
- "it's MegaHAL (chat bot parody of HAL from 2001: A Space Odyssey) from the late 9…" (ytr_UgwJWylGg…)
Comment
On the one hand, people are all about convenience, and if there is a way to do something easier, they will. But in doing so, they also give up control. One other hand, there are people who refuse to give up that control and won't use a service/technology because they don't value it. (one example: self driving cars). For me, I have zero interest, even if the vehicle had the capability.
youtube · AI Governance · 2025-09-05T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy63ZKCbWfbqASu9NR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwy5a5IX2BvVAhIcCl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy5AOyMamVIiVlBSiZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2uGdEZlaXKlRo7GN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugyye3Dp292FpsS63MR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwxWzuaWdA3u_4r1Xx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyJgsfbYOoqOofUIEF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugyj7aGqzAs55OvhKZV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxAlIGkUTxQLZkqfl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyko0x_KV2PnQzeeJ94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
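A minimal sketch of how a raw response like the one above could be parsed and checked before use. The field names come from the JSON shown; the sets of allowed code values are inferred from the examples on this page and are an assumption — the real codebook may permit more labels.

```python
import json

# Allowed codes per dimension, inferred from the sample output shown above.
# ASSUMPTION: the actual codebook may include additional values.
ALLOWED = {
    "responsibility": {"user", "developer", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "unclear"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows whose codes
    fall inside the (assumed) codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed entries
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(row)
    return valid
```

A row with an out-of-codebook value (e.g. `"responsibility": "robot"`) is silently dropped here; in practice one might instead log it for re-coding.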