Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):

- "While the is more about existing welfare than the UBI, the one point missed by t…" (ytc_UgzyBM4Hw…)
- "My main problem with this kind of ai is why would you want it to do it? Out of a…" (ytc_Ugy_pGg_P…)
- "NO BASIC INCOME IS COMING. AI AND ROBOTS WILL REPLACE HUMAN JOBS, LINE THE POCKE…" (ytc_UgzJu7vfF…)
- "i know how they can really get you! they can make ai art based on yours, and the…" (ytc_Ugx_N3Y1C…)
- "These will be used in the gaps in auto sort systems. Work stations where humans…" (ytr_UgwcOa5f5…)
- "Just like sometimes fiction seems more credible than reality, because reality do…" (ytc_Ugxaq5imj…)
- "What if AI art is just companies knowing it’ll get them criticized for them to g…" (ytc_UgyVy8C2a…)
- "I'm studying to get my paralegal certification right now and attempted to try an…" (ytc_UgxiQZIpA…)
Comment

> For me, the big problem with AI or at lest the way its implemented is that it never accepts that its possible and ok to be wrong now and again. People in general, fully understand that they can be wrong and to a lesser extent will correct themself in the future. That never happens with AI.

Source: youtube · AI Moral Status · 2025-08-28T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyDz0Op1YtXU_OmSRd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxW95hUyR3-aJpjvkl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgygLh_Mw81ph_Pvrex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhqLHiGCZDyeeCB2l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwtMMjeoLM_rHl09q14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwTxlvM1BXHFvB8x214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxL3xftZalw2Q9QQF14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-Su9EtHAgbllTXqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxnWf7NeV3n4mFfMQV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwoQfOU1XlY_79aMO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
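The "look up by comment ID" behavior above can be sketched by parsing the raw response and indexing the records by ID. A minimal sketch, assuming the model output is a valid JSON array with the field names shown in the response; the helper name `index_by_id` is illustrative, not part of the tool:

```python
import json

# A small excerpt of the raw LLM response above, with the same
# per-comment coding fields: id, responsibility, reasoning, policy, emotion.
raw_response = """
[
  {"id": "ytc_UgxL3xftZalw2Q9QQF14AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzhqLHiGCZDyeeCB2l4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
rec = codings["ytc_UgxL3xftZalw2Q9QQF14AaABAg"]
print(rec["responsibility"], rec["reasoning"])  # ai_itself deontological
```

The same index can back the dimension table shown for the inspected comment: fetch the record for an ID, then render its responsibility, reasoning, policy, and emotion values.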