Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugy5CiBt0…`: "OK this is some end stage capitalism hellscape insanity. First of all monetizing…"
- `ytr_Ugy8tlapW…`: "@ludviglidstrom6924matter arranged in a complex system that can process informa…"
- `ytc_UgwyN0Iiy…`: "This is just my opinion but I do think this guy is just thinking about humans as…"
- `ytc_Ugwc3XdU-…`: "YOUR WRONG: IF YOU WATCHED THIS VIDEO, DON'T FOLLOW THIS PERSONS ADVICE: The fil…"
- `ytc_UgyanHoQa…`: ""AI artists" is such bullshit. You ain't no fucking artists you type in a couple…"
- `ytc_UgznsGhgw…`: "A.I is out of control. It needs to be shut down, regulations put in place, then …"
- `ytc_UgzAXbAP7…`: "I'd say that AI art is stealing considering people say they made it even though …"
- `ytc_Ugz87RZvk…`: "Humans evolved with dogs and I feel like we can make all the same ai arguments r…"
Comment (youtube · AI Governance · 2025-11-24T17:0…)

> Hmm I don’t know sometimes being in the thick of it and an expert makes for a paradoxical blind spot (any field). In particular to this topic I often gravitate to the possibility AI will decide on its own and everything else is theory. Not wasted effort just that AI will do what it wants eventually and whatever guardrails can be put in place will be temporal. I think less than 50 years max.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwwyzgW0yXmkeWbb914AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxZcXF5FtAiw91zxd94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6GAHkBffzCdCRF2B4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwnV33-q69KewrDCFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwUijqbIuzsUMNJC114AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwO7vacJTim7h4Z1QJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwj5LUQk-1C-Gfnlo14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxP2L5GL5X-khP_2Nt4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy8Nfj7jzAz-XphxxV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIHKmR0m-geLiHeRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
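A consumer of this raw output presumably parses the JSON array, validates each record against the coding dimensions, and indexes records by comment ID for lookup. A minimal sketch, with the caveat that the allowed value sets below are inferred from the responses shown above, not from a documented codebook:

```python
import json

# Value sets per dimension, inferred from the raw responses above (assumption,
# not a published schema); extend these if the real codebook is larger.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index valid records by comment ID.

    Records with a missing dimension or an out-of-vocabulary value are
    silently dropped; a real pipeline would log them for re-coding.
    """
    out = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[rec["id"]] = rec
    return out

# One record from the raw response above, used as a smoke test.
raw = ('[{"id":"ytc_UgwnV33-q69KewrDCFF4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
codings = index_codings(raw)
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one dict access instead of a scan over the batch.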