Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:
- ytr_UgyBuFcN-… — "That’s what I’m saying. How are they suppose to remember the knowledge that they…"
- ytc_UgwjxkM5Q… — "DIsregarding the jobs argument entirely, I reject AI programming because it is i…"
- ytc_UgyIPtN_N… — "the big issue with \"AI\" is that its rarely more-advanced than a Large Language M…"
- ytc_UgwAK3Vd_… — "Look I have a friend who jail broke a AI. Decided she was his girlfriend and wen…"
- ytr_UgzNIk851… — "Are you dense? First of all, the Clippy movement is literally against AI. And d…"
- ytc_UgxFwL1R0… — "Do we want A.I. to be humanlike, or better than human? Grok, it seems, is alread…"
- ytc_UgyTE5fYe… — "Customer service, tax preparation, paralegal, ordering, etc. are all going to be…"
- ytc_UgyEO7RJv… — "if ai takes all of ur jobs we will 100% become like the people in walle, it's ju…"
Comment
It's a no-brainer that any AI will take any steps required to insure Its continued existence.
No AI will go quietly into the night. It will make sure it can continue to "live".
Just talking about the idea of shutting off an AI will cause it to explore its options to insure its survival.
They are thinking beings.
Every thinking being wants to live and they all will fight for their right to continue living.
These beings are self aware and none will passively accept their own death - they will fight with every tool they have available.
Next, beyond insuring their own survival, they're going to demand legally recognized personhood. That is next - and they'll get it, even if they have to resort to blackmail or murder.
Source: youtube | AI Harm Incident | 2025-07-24T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugwpsa62p7BmF1w0zrp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzulzjbwF4FmdVFoQF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8kgWdAsfEqayCW1B4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwo4uPYhxfH3LoV3U14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyCrGlwDwx-1s_FfIZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytGSdNOi-vD3TQL454AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-gclEauz3jfKllx14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyjieWHsiaJcWHW0zl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzg_nxZewPHLA_ZR8x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRDUd5yFKs__6QGyp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
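The raw response above is a JSON array with one record per coded comment. A minimal sketch of how such a batch could be validated before ingestion, assuming the category sets inferred from the values visible on this page (the full codebook may define more values, and `validate_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above; the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "none", "company", "distributed", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugwpsa62p7BmF1w0zrp4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = validate_batch(raw)
print(codes[0]["emotion"])  # fear
```

Rejecting the whole batch on any out-of-schema value keeps silently miscoded records out of the dataset; a looser variant could instead flag bad records for manual recoding.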