Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "lol, i have a great idea. put something that looks like a waymo device on your …" (ytc_UgyrFWhIq…)
- "You're absolutely right! AI, like Sophia in the video, is always learning and gr…" (ytr_UgxI52gTd…)
- "Was it intentional for the "AI bro"s framing to be white and unsaturated while t…" (ytc_Ugymaszqd…)
- "This guy talks shit, impossible to have a brain and not realise AI was going to …" (ytc_UgyxtFViU…)
- "I don't know what you're using, but i am pretty sure either you're prompts suck …" (ytc_UgxzFm4UT…)
- "this would make george washington and thomas jefferson shed tears. when they wan…" (ytc_UgyRM0s_1…)
- "I work in lowcode automation and a lot of my colleagues are very worried about A…" (ytc_UgygvbwzR…)
- "What I want from AI: A virtual ASSISTANT (not worker) that can help with organiz…" (ytc_Ugz6PIEPv…)
Comment

> Might be a better idea to put some thought and money into diplomacy, into building humanitarian bridges with say, China, rather than leaping into this "we gotta build a superior robotic military capability before they do" philosophy. This latter idea is the one that you appear to be embracing, largely, it seems on the concept of preemption. That viewpoint seems more aligned with the dark side of humanoid robot exploitation that it does with humanitarian morality.

youtube · AI Moral Status · 2026-04-03T23:5… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyD39jyb34Camv1WEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxaQtbVGrGKfxQ5aLl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzENmiTpDjE6m7oCmt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgybqbcJXB-5-MDFE594AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgygcsXhWmSoawe-quN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwsKOagYbBaP9yU11N4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxz42wGjGl-rhvGqY94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzjWRdi_1sxrBYmI1N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz0l7yz065oLJFBswZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyjE0muEcTiNC1cVhZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
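The "look up by comment ID" step above can be sketched as follows: parse the model's JSON array and index it by `id`. This is a minimal illustration, not the dashboard's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown above, and the two example rows are copied from it.

```python
import json

# Two rows copied from the raw LLM response above, used as sample input.
raw_response = """[
  {"id": "ytc_UgwsKOagYbBaP9yU11N4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxz42wGjGl-rhvGqY94AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment (KeyError if absent)."""
    return coded[comment_id]

row = lookup("ytc_UgwsKOagYbBaP9yU11N4AaABAg")
print(row["responsibility"], row["policy"])  # government regulate
```

Keying on `id` also makes it easy to join the coded dimensions back to the original comment text when rendering a page like this one.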