Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):
- "You can’t automate a mother’s touch or bond with her child, so I expect anything…" (ytc_Ugx-437My…)
- ""Why Artists are Fed Up with AI Art." That's you. Why you're fed up with AI art.…" (ytc_Ugyz6iVA7…)
- "Yeah I know how to code I use ai for boiler plate I do end up fixing and always …" (ytc_Ugx2t8VDw…)
- "HUMAN S DON T OPERATE ON THE BRAIN SYSTEM------THEY OPERATE ON CASH------IF YA G…" (ytc_Ugw4i6dRE…)
- "I just love a guy who calls any criticism of AI theft disingenuous and simultane…" (ytc_UgzSu1-d4…)
- "The sooner corporate America learns that the only thing AI should replace is the…" (ytc_UgyFZX0ED…)
- "If you were a 35 year old mechanic today and did the same process, would you be …" (ytr_UgwR9A7_T…)
- "AI stans are the definition of "zero effort" when trying to achieve something. T…" (ytc_UgzlFvn3N…)
Comment
In 2030 -2040 there will be robot doctors but people don’t like them at all and they actually cannot compete with the real ones so they will be given sone sub duties like public help in disasters or sty like that. Think this as an example. Agi in physical world will not as good as it. Will be in digital world. But I advice you to learn some agriculture. May god bless good people and our children too from these evil plans.
youtube | AI Governance | 2026-04-19T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyi3pLC5TNteZKtRsl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzY9SpYSNuvKnmiXl14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzjVGRISnbQ_h1BUUh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxnMiaSePTqEyeXPkF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyz-0zwbh8jKD8ICFx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzcn5ZLLA56giLLTdx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwo0MsgyhSbFrGE10x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugz_ZsvN615IytJOhBV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzJfAQotUfALcfOmcN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrwuDyEMGhQZVtBhd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
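The raw response above is a JSON array of coded records, one object per comment. Looking up the coding for a single comment ID (as the "Look up by comment ID" view does) can be sketched with a short helper. This is a minimal sketch assuming only the array-of-records shape shown above; `lookup_coding` is a hypothetical name, not part of any tool shown here.

```python
import json

# Two records copied verbatim from the raw LLM response above,
# standing in for a full response payload.
RAW_RESPONSE = """[
  {"id": "ytc_Ugwo0MsgyhSbFrGE10x4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "resignation"},
  {"id": "ytc_UgzrwuDyEMGhQZVtBhd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""


def lookup_coding(raw_response: str, comment_id: str):
    """Parse a raw LLM response and return the record for one comment ID.

    Returns the matching dict, or None if the ID is absent
    (hypothetical helper; field names are taken from the response above).
    """
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None


coding = lookup_coding(RAW_RESPONSE, "ytc_Ugwo0MsgyhSbFrGE10x4AaABAg")
print(coding["emotion"])  # resignation
```

The first record matches the Coding Result table above (responsibility `none`, reasoning `consequentialist`, policy `industry_self`, emotion `resignation`), which is how the table and the raw response can be cross-checked.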