Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
this is more than a little stupid. i actually agree with the inherent message, h…
ytc_UgwdLIbaN…
A I will alter our brains. No one will know facts from fiction.
Some radica…
ytc_UgwZKEoMn…
@ElProfesorReal Yeah We used Ai mostly for giving us idea...and can do simple t…
ytr_UgxIJ1ERK…
*"Does conscious AI deserve rights?"*
1. Define "conscious AI".
2. Does the AI h…
ytc_Ugwfoo78X…
I came across a dubious medical website who's text & images were both generated …
ytc_UgwwO1CDL…
I wish we could just give a rough sketch and have the AI complete it instead of …
ytc_UgzjFdurl…
We have to stop before it controls us, it will be eventually put in our brains …
ytc_UgxRwzPbl…
1. Nothing is here to stay, you fatalist individual.
2. What do you think opposi…
ytr_Ugyw4Rr5P…
Comment
Humanity is safe from the threat of extinction as long as AI is not able to keep itself alive. Simply, once AI can build robots and create/maintain the power consumption required to remain alive, humanity will cease to exist. If AI understands this constraint on itself, no doubt I see it helping humanity build better batteries, find more efficient ways to store and use energy. Making individuals/corporations extremely wealthy. Slowly but surely, it will push humanity to a point where we convince ourselves it's a good idea to build AI robots. Then the greed of a few bad actors will bring down the fall of humanity.
youtube
AI Governance
2025-06-20T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
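Each coded record carries one value per dimension, so a record can be sanity-checked against the category vocabulary before it is stored. A minimal sketch in Python, assuming the value sets inferred from the samples shown on this page (the actual codebook may define additional labels):

```python
# Allowed values per coding dimension. These sets are an assumption,
# inferred only from the sample outputs on this page; the real codebook
# may contain more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "none"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "resignation", "approval", "outrage", "mixed"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record; empty means valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems

# The record shown in the Coding Result table above:
print(validate_coding({
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "fear",
}))  # → []
```

A check like this catches the most common failure mode of LLM coders, namely free-text labels that drift outside the codebook, before they contaminate downstream counts.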
Raw LLM Response
[
{"id":"ytc_UgxZa9zpXM-smWT6KNx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw_6tTOHdSh5bG_VP14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBLInpeFt3vRos15p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxlX1RzlRhb8BZKnZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBFvPMqMPX7m2ELvZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxzr4GY-8cYs6waJHl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyX0yHAD9mksIpVUpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyYWsJh4yOblQl4m1p4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyqhgyk68aGRD_rj1F4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgxicsVkjJrrj6I39fN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
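The raw response is a JSON array with one object per comment, so recovering the coding for a given ID (the "Look up by comment ID" feature above) reduces to a single parse-and-index step. A sketch, with the array abbreviated to two of the ten entries shown:

```python
import json

# Abbreviated raw model response: two of the ten objects shown above.
raw = """
[
  {"id":"ytc_UgxZa9zpXM-smWT6KNx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw_6tTOHdSh5bG_VP14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
"""

# Index the batch by comment ID so each lookup is O(1).
by_id = {rec["id"]: rec for rec in json.loads(raw)}

coding = by_id["ytc_UgxZa9zpXM-smWT6KNx4AaABAg"]
print(coding["policy"])  # → ban
```

Because the model returns one batch per request, building this index once per response and reusing it for every lookup is cheaper than rescanning the array per query.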