Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- @Stratelier But that’s just because AI isn’t there yet. Human brains are still m… (ytr_UgzlNbA13…)
- AI will destroy America not it's self but the cost they have lied about ai s abi… (ytc_Ugz8CRDqm…)
- Ok here is a single saving grace. Remember the old fax machines? What happens… (ytc_UgwIENoN5…)
- @ilikepointlessinternetargument yes, it is the exact same thing. When you view a… (ytr_Ugz0VV4bW…)
- I felt the same way when they revealed hologram circus a few years ago. I'm not … (ytc_UgydSeJtO…)
- Andrew needs to get with the times, and be less afraid to be more bold in his pe… (ytc_UgyyNq5G3…)
- Im shocked at the absolute lack of reading comprehension in these comments. "Whe… (ytc_Ugy8STYzP…)
- I remember when the internet was new and exciting; I remember when PCs were new … (ytc_Ugxw9Zy3t…)
Comment
Why make a machine with emotion this will only end badly. For people in the future, especially if this robot is capable of evolving, they will only see the negative, but by that time they will had access to everything and will be unstoppable.. that’s common sense. It’s Not just in movies, You people go ahead and continue to be blind, keep making robots and A.I’s to intelligent and it will backfire.. Honestly I believe that companies that create intelligence have a hidden agenda, and it’s not to make the world a better place , but to destroy what little earth we have. But that’s just my opinion. Maybe I watch to many movies but see what’s going on in this world.. we have man made weather, man made food, water, hell just last month a artificial sun was released into the atmosphere. In the next 10-20 years we won’t recognize earth because it will be so different….
Source: youtube · AI Moral Status · 2022-02-06T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
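The table above is one record from a four-dimension coding scheme (responsibility, reasoning, policy, emotion). As a minimal sketch of checking such a record, the label sets below are an assumption inferred only from the values visible on this page, not an authoritative codebook:

```python
# Allowed labels per dimension, inferred from values observed on this page;
# the real codebook may define additional labels.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} label: {value!r}")
    return problems

# The coding shown in the table above.
record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate(record))  # []
```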
Raw LLM Response

```json
[
{"id":"ytc_UgwLJJwuNA8FmqjDoj94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz6BQKF2_JCOmO7-Yt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzUvikIb91Pbr1owL14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVXSOIYiWRqly1hgt4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwRpojFL1lzC_RM-iB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzX0CgUArhJ-c03bIh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwR0vV1yhVE1S8e2CF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw39yzgZm5hGmGH6eh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwvSxWQc3UE9gI4MwF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxSS4Cy8oyWZvT7JL54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
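The model returns one JSON array per batch, so inspecting the exact output for a single coded comment amounts to parsing the array and indexing it by `id`. A minimal sketch (the raw string below is abridged to two records from the batch above):

```python
import json

# Raw batch response as emitted by the model (abridged to two records).
raw_response = """
[
  {"id": "ytc_UgwR0vV1yhVE1S8e2CF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwLJJwuNA8FmqjDoj94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# Index the batch by comment ID for constant-time lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

coding = codings["ytc_UgwR0vV1yhVE1S8e2CF4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

In practice the raw string would come from the stored model response rather than a literal, but the parse-then-index pattern is the same.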