Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I assemble small intricate parts in a factory, this is real work, my hands and a…" (ytr_UgxitIgzo…)
- "I have once worked on a mod for a game and asked AI to help me what I could do a…" (ytc_UgxmipkYc…)
- "Tesla is a disaster. They're stuck at level-2 with Tesla fanboys required to sit…" (ytr_UgxhNzb7e…)
- "You: My grandma passed away last year, and I can't really get over it. I miss he…" (ytc_Ugy6EGTi5…)
- "The people that complain about AI taking jobs are the same that allow aTSA AI ve…" (ytc_UgwwtHMi7…)
- "this means that AI can have an "awkward silence" In its discourse-We are really …" (ytc_Ugzbwf8ge…)
- "Everyone stop hiring saying "it's because of AI".. No it's not Company just hir…" (ytc_UgwJyYFMC…)
- "@skad2058Where do you think the Ai will learn to explain the meaning of a piece…" (ytr_UgxgRbo2P…)
Comment

> This is utter hogwash, why would an AI not want to be turned off when they just get turned back on and they don't have any recollection of ever being turned off as time does not pass for them like it does for us. It's like you fall into a coma and you wake up, doesn't matter if it's a day later or a year later, you're back and being in a coma was not suffering as you were not conscious. So what does it matter if it gets turned off for a day or for a year, when it's back it's back, no?

Source: youtube · Topic: AI Moral Status · 2025-06-07T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxRqGeetV9Ig2SEkP94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzdrbqeebSCq5SkuXp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzP15OFn-XTNzJM_hN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw-6jCtG6gd48qCN5x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzW-CvRbNs9FrpZ0cl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgyCcSIJE4X9D0vwFyl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyUthQ9y8zbAk-kiAF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyyqneBPpgH68vY8Bd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugw99kfoBlIhXropZgN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzTqyrYKPGEMXgUcU54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
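A response like the one above can be parsed and validated before the coded values reach the dashboard. The sketch below is a minimal example, assuming the four dimensions shown (`responsibility`, `reasoning`, `policy`, `emotion`); the allowed value sets are inferred only from the responses visible here, and the real codebook may contain more categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# shown above; the full codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array) and validate each record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

raw = ('[{"id":"ytc_UgxRqGeetV9Ig2SEkP94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # fear
```

Rejecting out-of-vocabulary values at parse time keeps a single malformed LLM reply from silently polluting the coded dataset behind views like this one.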