Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I gotta say, I'm not impressed with the advancements that have been made in life…" (ytc_UgwtVYzgK…)
- "AI is our future. There is no avoiding it. I am curious to see where the PaLM p…" (ytc_Ugxo3dpag…)
- "I’m an example of the type of disabled people that AI bros use as a prop in thei…" (ytc_Ugx5Alw-g…)
- "I'll be honest when I say that from my experience using AI, it makes me feel pow…" (ytc_Ugxz3U7EP…)
- "stop using ai you fools , the planets gotten so much hotter the past couple year…" (ytc_Ugy30sBvl…)
- "This is the robot video we've all been waiting for, like humans circling around …" (ytc_UgzVwhgwJ…)
- "Coders remind me of designers and artists back when AI was starting out, they al…" (ytc_UgwlF2h72…)
- "Youre trying to create a robot that can learn / HAVE YOU SEEN TERMINATOR / WE GOT TO…" (ytc_UgiOlp03P…)
Comment
If AI were to try to solve the many major problems humanity has created, it would quickly realize that all other species live their lives within relatively small, natural cycles.
From a purely logical perspective, the simplest solution might not be to fix those problems, but to eliminate the one species responsible for causing so much harm.
And if the AI were merciful, it might choose not to destroy us entirely, but instead force us to live with the same small ecological footprint as every other species.
youtube · AI Governance · 2025-11-07T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxrQQ7u4xRx0nz4SZp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz_X9xv90fLq9QvY4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxGAd5Ea9ClHY2Xf894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz2Mzky-osL_Mhlq5d4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxVZlBooyGhOpAxxZt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyZYCfqXudsjEAaLtV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgymL8xEE2s_zyf0Vy94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAZU4G0vlw4_fI0LR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxX2EFNVbdVydXD4vV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRSDw-kLe-FBoglOd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]
```
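The lookup-by-ID view above can be sketched as follows: parse the model's batch response and index each record by its comment ID, keeping only records that carry all four coding dimensions from the table (responsibility, reasoning, policy, emotion). This is a minimal illustration, not the tool's actual code; the helper `index_codings` is hypothetical, and the two records are copied from the raw response above.

```python
import json

# Excerpt of a raw batch response in the format shown above (two of the ten records).
raw = '''[
  {"id": "ytc_Ugz_X9xv90fLq9QvY4p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyRSDw-kLe-FBoglOd4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index records by comment ID,
    skipping any record missing an ID or a coding dimension."""
    indexed = {}
    for rec in json.loads(raw_json):
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

codings = index_codings(raw)
print(codings["ytc_Ugz_X9xv90fLq9QvY4p4AaABAg"]["emotion"])  # fear
```

With the records indexed this way, inspecting the exact model output for any coded comment is a single dictionary lookup by its `ytc_…` ID.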