Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Scare all the simple minded can viewers. Beware of your new AI dictator. You sh…
ytc_UgyczmuGs…
Thanks for bringing us this thought provoking conversation.
I've been it IT al…
ytc_Ugx4jTXNo…
It all depends on how you open the topic to it. It's insane. If you ask AI "is t…
ytr_UgyWK3ZgD…
Liability example: How does AI address potential liability concerns when AI is u…
ytc_Ugyd20kR2…
With the recent news of EA's new owners looking to replace developers with AI, y…
ytc_UgzUJ832D…
Have you ever used chatgpt? They were right though about their concerns with mis…
ytr_Ugw664Rx6…
I guess I just don’t get it. He admits it’s AI generated then who cares. If he t…
ytc_UgyaRaS5K…
So is Deezer saying all AI music is fraud and thus should be demonetized? I ask,…
ytc_Ugy4QOgaP…
Comment
These AIs are not AGIs though.
And to be honest, is 'preferring to cause harm as opposed to failure' not a human trait as well? Self preservation is not unexpected, but these AIs are not capable of swimming outside of pools they're made to swim in, hence AI and not AGI. A lot of these situations are designed specifically to gauge results like these.
youtube · AI Harm Incident · 2025-07-27T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugy1gyAq20501eRaJhJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw_7F6O1B6aCwuQ6Ul4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZ2sfAereaJS2h3-h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz0iHVORyWS7RFHoON4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw1EMPj2vxmcPUCldh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxg9e_NB1ri2cqdbuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxWU72pJSlyVkGa0gN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwxf3qzAh7JO2yD_g14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwQnKVcip-kmIhNW5J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxlj1ZddQiYEzSUJq14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
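The lookup-by-ID step above can be sketched in a few lines: the raw LLM response is a JSON array of coding records, so indexing it by `id` lets any coded comment be inspected directly. This is a minimal illustration using two records copied from the response above; the variable names are hypothetical, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (two records copied from the batch shown above).
raw_response = '''
[
  {"id": "ytc_Ugy1gyAq20501eRaJhJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw1EMPj2vxmcPUCldh4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
'''

records = json.loads(raw_response)

# Index the batch by comment ID so a coded comment can be looked up directly.
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_Ugw1EMPj2vxmcPUCldh4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

Any record missing from the model output (or carrying an ID not in the batch) would surface here as a `KeyError`, which is one reason to inspect the exact output per comment ID.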