Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing the random samples below.
@charstew3207 Useing an ai program to harm ai programs. but one is much better i…
ytr_UgxfIDJyH…
First Ai will help riches to get more leverage on the rest of the population, wi…
ytc_Ugz9cWSXp…
We’re not going to combine the human brain and AI to make a superhuman? AI just
…
ytc_UgyHhjBJm…
I told you I told you 100 times this car Tesla is a failed system. …. That’s lev…
ytc_UgzL9eggQ…
The lack of human rights creates dysfunctional societies, so giving rights is no…
ytc_Ugg_-pM4p…
bhai entrepreneurship aur tech world mei bht difference hei. AI abhi se hi apna …
ytc_UgzWBkQ-D…
I am a pro-ponet of face recognition software because break in attempts like thi…
ytc_UgyXwccYL…
Seems super AI might want to stop humans from doing a nuclear armageddon, which …
ytc_Ugxp927l8…
Comment
They want to have Driverless Trucks...... Ok but what if a Child runs out in front of one of these Automated Trucks and it can't stop in time and kills the Child ? Then what
What if a Freak High wind situation happens and hits the Truck broadside and flips the truck over on the interstate...... No one is in the truck to answer the authorities to what happened. You tell me.
They have already learned how to Hack Cars and have Proved it at DefCon in Vegas. Catch what I'm saying ??
The simple fact is that this entire Automation of Semis is going down a dark road that we don't need.
How many deaths needs to happen before they pull the plug on an idea that's only going to lead to more problems than answers.
You tell me that one.
youtube
AI Jobs
2025-07-29T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id":"ytc_UgymIhiB3kuuEHqVgYd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxXgg5K6x7KGWoRPsZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxI8W2YShve-qQyyd54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzVsIOLR3G2FauMi2x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTjcPHAddf4kpeqhl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxGNUnFJWtWW6RFZrd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzGEro3sHcBO_DsEzR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyf3T0EXaLdk4-Ozkp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzEiSIkR5wVrkpB07x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxt6dxSoznOm5NzpHJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
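The raw LLM response above is a JSON array in which each element maps a comment ID to its four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a single comment's coding looked up by ID, assuming the response is valid JSON in exactly this shape (the function and variable names here are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (abridged to two rows from the full response above).
raw_response = (
    '[{"id":"ytc_Ugyf3T0EXaLdk4-Ozkp4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"},'
    '{"id":"ytc_UgzEiSIkR5wVrkpB07x4AaABAg","responsibility":"company",'
    '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]'
)

def lookup_coding(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    rows = json.loads(raw)
    return next((row for row in rows if row.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugyf3T0EXaLdk4-Ozkp4AaABAg")
print(coding["emotion"])  # fear
```

This is how the "Coding Result" table for the driverless-trucks comment could be derived from the batch response: the row whose `id` matches the comment supplies the four dimension values shown in the table.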