Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI will never be smarter than humans or dominate them. Period. However, I am wil…" (ytc_UgyNAiffU…)
- "Those data centers from Amazon already exist to power AWS infrastructure, but wi…" (ytc_Ugwts4lvU…)
- "**Title: The Beauty and Practicality of Mathematics** Mathematics, often regard…" (ytc_UgwnaOYYj…)
- "This is propaganda attempting to relate AI with cheating. Cheating and AI have …" (ytc_UgwClg70s…)
- "Why are we trying to create something that is more creative than everyone? And h…" (ytc_UgxtEnF2R…)
- "I dare people to ask the same thing to chatgpt and I will bet you don't get the …" (ytc_UgwR74Nf7…)
- "@emz-h Who the fuck cares???? Nobody is trying to force you to use AI, and if th…" (ytr_Ugx14lGvy…)
- "Proof, AI can/will be used to push humans to experiment on themselves. AI can't …" (ytc_UgwmP6OPS…)
Comment

"In my opinion, I think that the car would not be in that situation in the first place. It would know that you need to be set distance from a lorry. But to answer the question, I think the car would break hard! The self driving car knows to never hit another car. It would do its very best not to."

Platform: youtube · Topic: AI Harm Incident · Posted: 2016-12-20T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgiDRHNP6Ll3F3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiIYchWvUGckHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjiR5ifVgu5L3gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgiJaxBMly9MvXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghSuhCsL9iAHXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghAA7dcebmab3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UghAsDUNhcPf4XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi_4HU5JSF7SngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugiufh1PTT6cmXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UghXJhtXibHvFXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
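The raw response above is a JSON array in which each record carries the same four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before indexing by comment ID is below; the allowed value sets are inferred only from the codes visible on this page, so the real codebook may define additional categories, and the function name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the codes observed on this
# page (an assumption -- the actual codebook may contain more categories).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting records with unknown code values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID matches the page's "look up by comment ID" workflow: once parsed, the codes for any comment can be fetched in constant time.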