Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
By quality of todays paid AI services, replacing humans might never happen.
Creating nice emails and passing online test (answers on which you can find online) is NOT AI yet.
Anything more complex than that, will cost hours of trying to squeeze from AI that it can't produce.
youtube
AI Governance
2025-09-04T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx0ZsdHPDXJxIShiWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvjbMxd81f2M3MUSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxCGOJjkng6dDbDht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWWrDfErp2Nd9WpQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBGjTJIBYttPQ_KfV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlD30wXrc8WzIbVmZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5zftANSmRpq7xXyd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxy5fOl0byMBtz004t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJfySQEIn2bf8nK1l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8r0Xx0xO8tVvq9B94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
```
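Each coded record assigns one value per dimension from a small closed set. A minimal sketch of parsing and sanity-checking a raw response in Python, with the allowed value sets inferred from the examples on this page (the full codebook may define additional values):

```python
import json

# Allowed values per coding dimension, inferred from the records shown
# above. This is an assumption, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "outrage", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose dimension
    values all fall inside the expected schema."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]
```

Filtering rather than raising on a bad value lets the pipeline log and re-prompt for malformed records without discarding the whole batch.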