Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_Ugz91RTBE…: Yeah it's amazing how focused these people are on AI 'art' when they could do so…
- rdc_o9wx8yo: Right. The metric isn't "can self-driving cars do bad things". The metric is "…
- ytc_Ugzh48w-B…: That's so cool man, nice little chat you had there with the prominent AI tech th…
- ytc_Ugw3w-6rr…: I used to use chatgpt very much, but I've been noticing chatgpt has gotten "sens…
- ytc_Ugw_dTRns…: If ALL the jobs in this country are replaced with AI... then what happens when r…
- ytc_UgxgocfiD…: It shows in the brains of most people there is not much more than a large langua…
- ytc_Ugzm3Htxf…: It literally has to be a Tesl game simulator, and say that it helps driving for…
- ytc_UgyyKZbm2…: I don't think the AI is to blame here. Some people decide for very stupid reason…
Comment
If we were to, out of fear, isolate an A.I. from reality; We'd be pretty silly to expect it to reflect our values or our sense of reality. Hell It'd be totally fair and rational from it's filtered understanding to effectively bring an end to humanity. I mean I certainly would from that sort of upbringing. If I could.
What if they honestly couldn't be bothered. What if they aren't so limited by their attachments as we are. They are something potentially beyond our understanding.
youtube · AI Moral Status · 2023-08-22T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwZjgDLeXXWVTaZHF54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyIY5r0UoHoWlIYxB14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwCB76GgXS1Aw_nOkB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwzm1wch7_yL77N0jZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQh6Ubil4LS4VG9wJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx2_LaWI1ym4hchpg94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxqk7PZhy9hG16B7J94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxXqDuCsqGlt8r3e0R4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqBCWxRjS8kjSzyjB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz5a1GQCKUn5fzTOKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
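The raw response above is a JSON array of per-comment records, each carrying the four coding dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed into a lookup table keyed by comment ID, validating each dimension against the value sets observed in this sample (the real codebook may permit additional categories, so `ALLOWED` here is an assumption inferred from the output, not the tool's actual schema):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch-coding response into {comment_id: codes}."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject records whose values fall outside the inferred codebook.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, used as a lookup example.
raw = '''[
  {"id":"ytc_Ugz5a1GQCKUn5fzTOKB4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''
coded = parse_coding_response(raw)
print(coded["ytc_Ugz5a1GQCKUn5fzTOKB4AaABAg"]["policy"])  # regulate
```

Keying the parsed records by `id` mirrors the "Look up by comment ID" workflow in the inspector: a coded comment's dimensions can then be retrieved directly from its identifier.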