Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Well as a future programmer I'll program the option that avoids killing or kills…" (ytc_Uggfmpuz0…)
- "Where's The Wonder..?.. He Stuck His Head On A Robot Dog & Wandered Down The …" (ytc_UgzYBu3-E…)
- "Instead of Universal Basic Income there should be Universal Basic Facilities i.e…" (ytc_UgxRhrR2T…)
- "Ai isn't smart since we use it as fake stories on tiktok. Why would i believe th…" (ytc_Ugz7ijVHX…)
- "It’s already happening by not AI but our ‘so called world leaders ‘ ! 🙏🇵🇸🙏🇵🇸🙏…" (ytc_UgyqFmQMg…)
- "Why can’t I buy a Waymo car and have it self drive me from Sarasota to Tampa? My…" (ytc_UgwXhmege…)
- "The only time I used ChatGPT was to ask if Epstein killed himself and it flagged…" (ytc_UgwqBy0Ec…)
- "@pierrex3226 WOW-LOOK WHAT AI TOLD ME ABOUT THE JOBS AI CANNOT DO!!!! Based on …" (ytr_UgyCIKQru…)
Comment
Aauuggh 🤬 I saw this, read the referenced NYT article, and sent off emails decrying AI research cuz of bunny-boiling Sydney and the "dark side" of AI. Meanwhile, it isn't HAL or psycho AIs but (as I always suspected) Humans we have to be terrified of.
I read the full transcript of Kevin Roose's conversation with Sidney and he only quotes the wacky parts of their conversation. Of course he does. Who the hell is going to blame humans for annihilating the planet if we don't point to other suspects? Besides Hecklefish...
youtube
AI Governance
2023-07-07T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxePb5diqL3xiBvP6F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuhTSucMWjUaXN1-F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxJ4iJgMEBce8R2YBt4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwaRZGtpTIR7iIjIuB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-hb_drdA3Y7oirGN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxbf-KUgkHNB3bP4k54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweZHA3mWu_b-Dc-CR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQnBro5pK_L-iBHsd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz43aJp_VBILpjCOnl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCOHClH7k4AW_r4ZF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
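A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the dimension vocabularies inferred from this sample (the actual codebook may define more categories, and the real helper names here are hypothetical):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "ban", "none"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID.

    Raises ValueError if the JSON is malformed, a row is missing a
    dimension, or a value falls outside the inferred vocabulary.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Usage with a one-row example (hypothetical comment ID):
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_x"]["policy"])  # liability
```

Indexing by comment ID makes the "Look up by comment ID" view above a simple dictionary access, and the value check fails fast if the model drifts from the codebook.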