Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Alex Gordon
chatgpt is a joke yall. ask it a question, then ask …
ytc_UgzswSDzY…
weirdly enough, the idea that AI is going to kill us all is also self-flattery b…
ytc_Ugz2hE4E9…
I don't think that's it. I don't understand art and I'm not against AI, but even…
ytc_UgyDrl_FU…
I haven’t watched the video, but I’m skeptic about AI now. I used to be very sca…
ytc_UgyP7Khbh…
I think you got the math, a bit wrong... Everyone has a phone and a house and us…
ytr_UgymTtolF…
Art allows me to make naked caked up leon for resident evil, ai blocks the promp…
ytc_UgwFdslLm…
@brookedickson4118 the difference is if i looked up someones art and used it in …
ytr_UgwWSU0Cy…
As an artist for me i see AI art as a spit in the face, since there no love or e…
ytc_UgzMiQM0M…
Comment
Something I love about listen to him is that he approaches problems from a constructive, productive method that only a scientist could. It’s true…when I hear “6,000 people killed by self driving cars” I get angry at that…when I could be saying “wow that’s so much better.” It’s not a perfect argument by a long shot, but it’s that other side of the coin that productive…meaningful debate should be. I’d love to have a healthy debate with Neil someday. He’d bring me back to my old poly sci college days.
youtube
2023-08-17T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzLciLOwxqBH1njt514AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGFIDda8eA4_vOdpB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxg3LXjQgthWDqb7bt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzju5sLTZkwAPK15MR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwYYxvyKxiqLZdOFR94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxg65uHegCCheJHrTN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwZxBImuXFNquOqPvt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyk_cspxKRoeeR008N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwzG3KHXtFLCecFFul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgySDaNaWRiooVqu6dZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}]
```
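A raw response like the one above can be checked programmatically before it is accepted into the coding table. The sketch below is a minimal validation pass, assuming the dimension names from the coding table and allowed values inferred only from the records shown here (the actual codebook may define more values); `validate_records` and `ALLOWED` are illustrative names, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the coding table and the
# raw LLM responses shown above (assumption: the real codebook may be larger).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "regulate", "ban", "industry_self"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and verify each record's coded values."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzLciLOwxqBH1njt514AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coded = validate_records(raw)
print(coded[0]["emotion"])  # approval
```

Running this over the full ten-record response above would raise on any record whose coded value falls outside the observed sets, which is a cheap guard against malformed model output.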