Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "AI was supposed to make our lives easier and help us but instead it just took ov…" (ytc_UgwqVmzzb…)
- "Once AI reaches a stage where it's fittable to robots then it's game over for pr…" (ytc_UgyG9EXaf…)
- "They'll give jobs "supervising" A.I. to the "right people," and the rest will st…" (ytc_UgxbEpqsj…)
- "It's such a nothingburger of a technology only meant to fuck up access to water,…" (ytc_UgxxXYATQ…)
- "And also the idea of "does it have experiences" and the answer being this incred…" (ytr_UgykcxymM…)
- "I use AI to write fiction. It is writing so well. No matter who whines, I will c…" (ytc_Ugy5SVQKz…)
- "Absolutely! Wisdom goes beyond knowledge; it involves understanding and applying…" (ytr_UgxWSqKiE…)
- "I would like to work in one of those vans but by the time your hired your alread…" (ytc_Ugx55JE9A…)
Comment

| Field | Value |
|---|---|
| Comment | Eventually people will have implanted brain chips and then AI will be doing our thinking, so to speak. I read that AI would decide what it computes re: our decisions of what we need and want. |
| Source | youtube |
| Video | AI Moral Status |
| Posted | 2025-03-31T05:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxX0x-dvAWr_VsyUSl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxUQAlUHlYI-j-R5N54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy9djmB_b973QlySmN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwk3g10k7jeSlRsIjF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw66dGbCfV7j5N0qOJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxecr3tu7ApyYH_MR54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy_uns5wgE5BAFpoeF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxcZOMD3kjK45AOPut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyQNjvigABa8UjuiZ14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx_tfBMSuyG-Huplz14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
```
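A raw response like the one above can be turned into coded records programmatically. The sketch below parses such a batch and drops any record whose values fall outside the codebook; the allowed-value sets are assumptions inferred from the values visible on this page, not an authoritative codebook, and `parse_coding_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# values seen in this page's tables and raw responses, not a full codebook.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def parse_coding_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Skip anything that is not a dict with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension has an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '''[
  {"id":"ytc_example1","responsibility":"developer",
   "reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_example2","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''
print(len(parse_coding_batch(raw)))  # 2
```

Validating against a fixed value set like this catches the common failure mode of the model inventing a label outside the schema, without rejecting the whole batch.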