Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Of course Donnie isn't going to meet with Zelensky, he is far to busy licking Pu…" (`rdc_mcrt41s`)
- "He meant to say "Beg" Because that's the only thing in which AI won't replace yo…" (`ytc_UgwQIxTRq…`)
- "Unfortunately this is happening in the rail industry freight trains, operated au…" (`ytc_UgwZIOx_b…`)
- "how delusional do you have to be to post AI slop with no soul and think "yep. th…" (`ytc_UgwWlBPdF…`)
- "It's fairly simple for governments to say, your company must employ X number of …" (`ytc_UgwTPCRpk…`)
- "I can see why you might feel that way! Sophia's responses can sometimes come off…" (`ytr_UgyElFlqv…`)
- "If I can buy a robot to fix cars, #1 I own that asset (the robot) which is going…" (`ytc_Ugz5R17XI…`)
- "Bottom line AI is smarter than humans by alot. Itll plan it out so meticulously …" (`ytc_UgzEn6DHM…`)
Comment
As both a rider and a Tesla owner this video pains me. Not because I want to fly into a rage against "ALL YOU ELON HATERS!" but because it has a lot of truth to it. And it is precisely one of the reasons why I don't have FSD/FSD Beta or use Autopilot at night or when I am being followed too closely. Ultimately I am responsible for what my car does. We all are. The automated features are nice but they have a real long way to go before I will feel confident that they won't kill me or somebody else.
Source: youtube · Topic: AI Harm Incident · Posted: 2022-09-03T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx8moUl6EJZkFS7P714AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy0AP_Pp3vlulMJqtZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-yOEoBIQp4Qzjy1N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgymMIWpL9GSaG1-2wl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugzxqt7bVMnYuRUZLd14AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwUVfwXku90xKirGgJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz2DvIMOu6H14tQOeF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzreR0wjmEax0vs9254AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyN6Oth4LTS1Ixa_Bl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxAiAEwArnDsRwwQCx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
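A batch response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical parser (the four dimension names match the coding table; the helper name and validation are assumptions, not part of the original pipeline). It indexes the array and checks that every row carries all coded dimensions:

```python
import json

# Two rows copied from the raw batch response above (illustrative subset).
raw = '''[
{"id":"ytc_Ugx8moUl6EJZkFS7P714AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgymMIWpL9GSaG1-2wl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]'''

# Field names taken from the coding-result table; "id" links back to the comment.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse one batch response and index it by comment ID,
    raising if any row is missing a coded dimension."""
    out = {}
    for row in json.loads(raw_json):
        missing = REQUIRED - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing fields: {missing}")
        out[row["id"]] = {k: row[k] for k in REQUIRED - {"id"}}
    return out

codings = index_codings(raw)
print(codings["ytc_Ugx8moUl6EJZkFS7P714AaABAg"]["responsibility"])  # user
```

Looking up the comment shown above (`ytc_Ugx8moUl6EJZkFS7P714AaABAg`) reproduces the values in the coding-result table: responsibility `user`, reasoning `deontological`, policy `none`, emotion `resignation`.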