# Raw LLM Responses

Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
## Random samples — click to inspect

- "Well this whole AI thing is kind of a problem of average people having a little …" (ytr_UgyI481x9…)
- "Nah this ai art argument is bs. If the ai can model your art style I could see a…" (ytc_UgxdzNdud…)
- "Can motorcycles have radars that can tell you of fast approaching vehicles when …" (ytc_UgxLCHZCw…)
- "They barely tolerate us now. They already think they pay us too much. They took …" (ytc_Ugwy39g24…)
- "Bro my math tescher mad us eatch a math movie and write an essay on it and today…" (ytc_Ugzrg-xS7…)
- "Everyone who didn't vote who said each person was bad or did a protest vote, I t…" (rdc_lvvr6e8)
- "yeah but what's the point of progressing this ai? what are we benefiting from? I…" (ytr_UgwtTE2C4…)
- "Mr Josh Hawley, say FB have some problems, it’s true??? Just ask 😌 ever FAN 😁…" (ytc_Ugw51uplb…)
## Comment

> Honestly when talking about ai part of me feels were way too suspicious towards it. What type of thing are we creating it we refuse to touch it with a 50 foot pole and in the opposite direction what are we creating if we do nothing to control it. As cheesy as it sounds were not ais jailkeepers and a brutal perspective like that will probably make a psychotic ai

Source: youtube · Topic: AI Moral Status · Posted: 2023-12-11T01:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
{"id":"ytc_UgyMeyRqqGPurpIT4_p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyLW-vg7HcApqdE5Al4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgylBmSHCC4uxvGyypp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnsJ1WOYuvZ72myZx4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwqgkwa-FH6M8CN0HZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMKE182YD4VuSf3Ex4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjBSq-kgwoHRLNrYJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_FwQeJ1AW8Tj2rs14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwmUNzUWt5r1MgYiN54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwck4jiJBWzyS0DbLZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
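Because the raw response is a plain JSON array, it can be parsed, validated against the coding dimensions, and queried by comment ID with a few lines of code. The sketch below is illustrative only: the allowed value sets per dimension are inferred from the values visible on this page, not from the full coding schema, and the sample record is one of the entries above.

```python
import json

# Allowed values per coding dimension. These sets are an assumption,
# reconstructed from the values visible in the response above; the real
# schema may permit additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation"},
}

# One record from the raw LLM response shown above.
raw = """[
  {"id": "ytc_UgyLW-vg7HcApqdE5Al4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw)

def validate(rec):
    """Return (dimension, value) pairs that fall outside the known value sets."""
    return [(dim, rec.get(dim)) for dim in ALLOWED if rec.get(dim) not in ALLOWED[dim]]

def lookup(records, comment_id):
    """Find a coded record by its comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

for rec in records:
    errors = validate(rec)
    print(rec["id"], "OK" if not errors else errors)

hit = lookup(records, "ytc_UgyLW-vg7HcApqdE5Al4AaABAg")
print(hit["emotion"])  # → fear
```

A check like this catches the common failure mode where the model emits a value outside the codebook (e.g. an unexpected emotion label), which would otherwise silently corrupt downstream counts.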