Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "So true, my first was roleplay, succesful, being errand-boy with bodyguard to be…" (ytc_Ugzkco-BK…)
- "I get personally irate when I call for assistance and it’s a guy named “bob” in …" (ytc_UgwRxTSpM…)
- "@legion8328 No, you don't get it. AI EMULATES emotions, it does not EXPERIENCE t…" (ytr_Ugy-pZgqp…)
- "In the first example, if you would face your inevitable end, would you quietly a…" (ytc_UgztSyy37…)
- "I am a robotics engineer and it is not possible for a robot to automatically do …" (ytc_Ugx12MlLZ…)
- "Interesting video. I was kinda waiting for a video about AI from the depression …" (ytc_UgzACBaVc…)
- "You only need to press some keys to give AI a prompt. You guys know what that me…" (ytc_Ugyrab-kD…)
- "Hype train has been dying for some time now. Google and Apple have already put o…" (rdc_nc7cdmf)
Comment

> autopilot, or FSD. FSD is their advanced software that is capable of preventing crashes, which is the version that Tesla advertises as safer than human drivers. Autopilot is simply cruise control with lane assist, and does not prevent crashes. So yes, if you use autopilot, you may crash. If you use Full-Self Driving (FSD) then it is 10x safer than a human driver.

Source: youtube · AI Harm Incident · 2025-07-09T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_UgzS22curcVLHYMeEcB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxsyVuEt91IuC1Anu14AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxHA0BXHosVnTnJAcJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyH1IKZealXA9AgyGB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzaC1hTZaF9xfdvZed4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxMZG2fgLSAsYeM11F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzV2_xAOnxtyHzSz8V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkVupxOSMiKG2C3OZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyE61DIxfHpgbTH2ux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyx9kaZHh97qBg2iGB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
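A raw response in this shape can be parsed and sanity-checked before the codes are accepted into the dataset. The sketch below is a minimal example; the sets of allowed category values are assumptions inferred from the sample output above, not the project's actual codebook, and `validate_codes` is a hypothetical helper name:

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample response above
# (an assumption, not the authoritative coding scheme).
ALLOWED = {
    "responsibility": {"none", "government", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "approval", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-schema records."""
    records = json.loads(raw)
    for rec in records:
        # Every record must carry exactly the id plus the four coded dimensions.
        if set(rec) != {"id"} | set(ALLOWED):
            raise ValueError(f"unexpected keys in record {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec.get('id')!r}: bad {dim}={rec[dim]!r}")
    return records

# Two records excerpted from the raw response above.
raw = """[
 {"id":"ytc_UgzS22curcVLHYMeEcB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxsyVuEt91IuC1Anu14AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

records = validate_codes(raw)
emotions = Counter(r["emotion"] for r in records)
print(emotions)
```

Validating against a closed value set like this catches the common failure mode where the model invents a near-miss label (e.g. "anger" instead of "outrage") that would otherwise silently fragment the category counts.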