Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Hot take, AI ‘artists’ shouldn’t be called that, they should just be called “AI …" (ytc_Ugwx2ccA8…)
- "It's all a huge fallacy, and SUPER dangerous, why we should not be going down th…" (ytc_UgySlb0X3…)
- "DOD is already using anthropic. All of these AI companies are run by lunatics wi…" (rdc_o85qstp)
- "What happens when this "healthy fear"of Ai wears off in five years after its bee…" (ytc_UgykdObeh…)
- "It’s like Europe pretending they don’t see the Germans preparing for war. Peop…" (ytc_UgzwhW_9I…)
- "Okay so why not have an AI create the art and then in your final product change …" (ytc_UgwDaErxL…)
- "The title of this video alone makes me sick to my stomach. I'm not watching this…" (ytc_UgwWZWaEV…)
- "Capitalism is doing the exact opposite of what machines and AI are supposed to d…" (ytc_UgzyYI7oc…)
Comment
First off AI Safety Research is a real international thing. And governments can’t be relied upon to regulate such advanced technology. We need regulatory agencies being the actual developers. Not the existing agencies. It’s a whole new thing that requires an all new agency and all new regulations. I’m sure you agree. Even though you want to make intelligent robots that number one to one to humans. Second off, why are you talking to TC at all? Oh, it’s because you agree with his politics. Ok, now we all know.
youtube · AI Governance · 2023-04-21T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyhHaYYQKbdcXRjjSd4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz3S_D0PjBXG_jIupx4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzTM8inCYHBmTIEFMF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwK1t9X5dMdZsqWXPB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyiWrJAjBcLltVPvGd4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxixiHztrqsiPKLvqN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwXr-JxpyKip_RFfdh4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw2o41o3Dqrd69HAHh4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugxf5O9YdP_RmKNmoIt4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy78PPgLZ0Yy-cMbO54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
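As a minimal sketch of how a response like the one above could be turned into per-comment codes, assuming the model always returns a JSON array of objects with these four dimensions (the `parse_coding` helper and the "unclear" fallback are illustrative, not part of the actual pipeline):

```python
import json

# A single-entry example mirroring the raw response format shown above.
raw = '''[
  {"id": "ytc_UgyiWrJAjBcLltVPvGd4AaABAg",
   "responsibility": "distributed",
   "reasoning": "contractualist",
   "policy": "regulate",
   "emotion": "mixed"}
]'''

# The four coding dimensions seen in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding(raw_text):
    """Map each comment ID to its {dimension: value} codes.

    Missing dimensions fall back to "unclear" (an assumption;
    the real pipeline may reject such entries instead).
    """
    coded = {}
    for rec in json.loads(raw_text):
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codes = parse_coding(raw)
print(codes["ytc_UgyiWrJAjBcLltVPvGd4AaABAg"]["policy"])  # regulate
```

Keying by comment ID makes it cheap to join the parsed codes back to the original comment records, which is what the lookup-by-ID view above relies on.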