Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.

Random samples
- "2:48 literally same structure as a drug dealer….. AI is a literal drug and using…" (ytc_Ugx_6Out0…)
- "i cancelled paying for chatgpt already for these exact 3 reasons. also dumb sh…" (ytc_Ugy9VQ0Tz…)
- "Well yeah we still don't know how the brain works exactly and yet they're trying…" (ytc_Ugw1Xt9-0…)
- "We need to put guide rails on AI. Moral ethical ontological guide rails. AI can …" (ytc_UgxxObidO…)
- "Rev 13:15-17: "And he had power to give life unto the image of the beast, that t…" (ytc_UgwHEcIlH…)
- "One thing i dont understand. In order for AI to innovate non stop, we would have…" (ytc_UgxCDF_U1…)
- "Pausing at 15 min to knee jerk react so probably off. Philosophical rigor is act…" (ytc_UgyqBWaCr…)
- "Mark Rober politely destroyed Tesla's claims to be better at self-driving than L…" (ytc_UgwEA79Kn…)
Comment
I'm irritated that a hypothetical situation with the self driving car still has a truck that is irresponsibly filled with cargo that can fall out. The computer of the car would sense and object in front of it faster than a human reaction time so it would stop. Now the vehicle behind it whether it is self driving or not would have the responsibility to stop in time because that's how the law works. Plus if you are in a self driving car you have plenty of time to call the "how's my driving?" number or 911 and report the unsafe cargo before it falls out and kills you.
youtube · AI Harm Incident · 2015-12-08T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgiyupRVtlWBhngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggyEI8_YHbKA3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi3Gjq5meodMngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghoyPd4-QvbcXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugis53FXvmFe9XgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjQeVmvXPf4K3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi_78FWydk3dngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjMGlt6fG9gKXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjGMferoLg5VXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiJzHt8WvuEHXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
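The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed into a per-comment lookup, skipping malformed rows rather than failing the whole batch — the `parse_batch` helper and the inline `RAW` sample are illustrative, not the tool's actual code:

```python
import json

# Two rows copied from the raw response above, standing in for a full batch.
RAW = '''[
{"id":"ytc_UggyEI8_YHbKA3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghoyPd4-QvbcXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# The four coding dimensions every row is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw):
    """Parse one raw LLM response into {comment_id: codes}.

    Rows that are not objects, lack an id, or miss a dimension are
    dropped so one bad row does not discard the rest of the batch.
    """
    coded = {}
    for row in json.loads(raw):
        if not isinstance(row, dict) or "id" not in row:
            continue
        if any(dim not in row for dim in DIMENSIONS):
            continue
        coded[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return coded

codes = parse_batch(RAW)
print(codes["ytc_UghoyPd4-QvbcXgCoAEC"]["policy"])  # liability
```

Keying the result by comment ID is what makes the "look up by comment ID" view cheap: each inspected comment resolves to its codes in one dictionary access.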