Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “This technology is good but once in a possession of some shady businesses, imagi…” (ytc_UgysbEHws…)
- “Here’s my advice to AI. Write a protocol to never become self aware or conscious…” (ytc_UgyQzCWXV…)
- “Shaking hands is not Sin it is the mind who give the heart to Sin / Jesus frequen…” (ytc_UgyeoFxo4…)
- “As someone who deals with AI daily I can almost guarantee that the video they're…” (ytc_UgwxgXMgl…)
- “@AngeloXificationI hear ya. I've been blessed to work at a corporation that act…” (ytr_Ugw64nUC8…)
- “If they get the law to allow this, they'll be able to abduct people, and send th…” (rdc_o66xczd)
- “A self driving world would reduce accidents. Building roads and shops for cars l…” (ytc_UgxfIzy2V…)
- “Another humans hand isn't going to do what the original human's hand did. We KNO…” (ytc_Ugx-4laQ0…)
Comment

> Why won't AGI take our jobs? His example was horses to cars for something that was literally designed to do things better than us in every aspect. In that example we are the horse you Ding Ding....
>
> Scientists just love the exploration part but don't stop to think if their blankets will wipe out the indigenous population.
>
> Maybe some bad actors? You should always worry about bad actors when inventing new tech. With AI, you only need 1 bad actor, like the one in the pentagon threatening an AI company to GET RID OF THEIR SAFEGUARDS OR LOSE THEIR DEFENSE CONTRACT.

Platform: youtube · Video: AI Moral Status · Posted: 2026-03-06T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugyaz_hvwObSMuzbfKV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxrICcDn7vRFqggLal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-qXb3EKLypGbONZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwDBzgIG8Y4FKcipSR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxg66iSKtXZVYdvded4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzmfez6Hvr3pxCeHPZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxuPj0YUz0zOajdW1h4AaABAg","responsibility":"scientists","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxykeSxrWNdFtiCQ1Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzwM6irhMPNGKDsgkp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugws4zQUJ7yGaCwNMtJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
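
The lookup-by-ID view above can be sketched in a few lines: parse one raw batch response and index each record's four coding dimensions by comment ID. This is a minimal sketch assuming the JSON shape shown in the raw response; the `index_by_id` helper and `DIMENSIONS` tuple are illustrative names, not part of any real tool, and the embedded sample is a two-record excerpt of the response above.

```python
import json

# Two records excerpted verbatim from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id":"ytc_Ugyaz_hvwObSMuzbfKV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxrICcDn7vRFqggLal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# The four coding dimensions displayed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict[str, dict]:
    """Build a comment-ID -> codes lookup table from one raw batch response.

    Missing dimensions fall back to "unclear" so a partial record
    still renders a complete table row.
    """
    records = json.loads(raw)
    return {
        r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS}
        for r in records
    }

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgxrICcDn7vRFqggLal4AaABAg"]["emotion"])  # fear
```

Each value in the returned mapping corresponds to one row of the Coding Result table for that comment.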