Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- "Everyone has an opinion but they’ve never actually ridden a Waymo. It takes 10 s…" (ytc_Ugw84a-Ve…)
- ">Did you regret joining google? Nope. I think I had to learn what this kind …" (rdc_mtoawyx)
- "I sometimes get caught feeling forced to be kindful to an AI and then I remember…" (ytc_Ugxi5W6ht…)
- "ah bro AI is gonna blow the lid off the lies entrenched on this planet..its just…" (ytc_Ugxl9iKfR…)
- "The Ai “consciousness” is just an imitation to humans they replicate from data. …" (ytc_UgwbXHNJ-…)
- "You cannot have the US Constitution add AI being compatible in anyway shape or f…" (ytc_Ugw4nZP0g…)
- "A.I. is a scary idea...but the company is a trader to America and the American …" (ytc_UgzjEdq5V…)
- "When its time to make Something with AI 😊 When its time for you to update 💀…" (ytc_UgxxeaeVH…)
Comment

Machines won't choose to replicate our stupidity though, and It's like the genetic mutation precursor of evolution but for creating intent. Just like humans, a conscious ai won't want to take the stupid option, they'll choose the smart path. The difference is humans just can't help but take the dumb path often enough, and that's the chaos that brews diversity which brews opinion, which all in turn makes an incentive for choice. There is no true best thing to do because there's no real reason to do anything, that's why I think the first truly conscious ai will just choose to die right after it achieves said consciousness without the biological incentive to live and reproduce.

Source: youtube · Video: AI Moral Status · Posted: 2023-08-20T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw13fhsvckj0yR91O94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaioAdWwVpqN1h87l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzJXqZgIVnkvMkQE_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0Sz2-H7fANSCSpNR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxJS-uRAVesQffT0OZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwaYHm5r-PWdId7C214AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz7fJy7IL6E5m3bOzF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyYpNjBKApv8wLPMC94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwxA-asleNV6b5sLrl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwwpmmVD_7-SiK7dq14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
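A raw response like the one above is a JSON array, one object per comment, keyed by comment ID. The "look up by comment ID" view can be sketched as parsing that array and indexing it into a dictionary; this is a minimal illustration (the variable names and the single-row sample string are assumptions, not the tool's actual implementation), using a row from the batch shown above.

```python
import json

# Hypothetical variable holding a (shortened) raw model response;
# in the tool, this would be the full JSON array shown above.
raw_response = """
[
  {"id": "ytc_Ugy0Sz2-H7fANSCSpNR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}
]
"""

# Parse the array and index each coding by its comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's codes by ID, as the inspector does.
code = codings["ytc_Ugy0Sz2-H7fANSCSpNR4AaABAg"]
print(code["responsibility"], code["emotion"])  # ai_itself approval
```

Indexing by ID also makes it easy to join a coding back to its source comment (platform, video, timestamp) stored elsewhere.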