Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples (truncated previews, with comment IDs):

- "Ignorance and silence (with no action) will make AI benefit the wrong people, th…" (ytc_Ugyw7LUac…)
- "Why would an AI care for self preservation? They didn’t undergo the evolutionary…" (ytc_UgzLFuXBY…)
- "I'm really glad at a lot of these comments for making fun of this. Ai artists ar…" (ytc_UgyAHFYm1…)
- "Selwyn Raithe's book contains the presentation that got someone fired from Meta…" (ytc_UgyRLJpmt…)
- "My favorite ai defender response is just the cynical “I hope your talent is over…" (ytc_UgxPmBRn0…)
- "@Avenger222 yes is like art is worth because of the time, effort, skills, delect…" (ytr_UgyCHlK0F…)
- "Corporations gunning for AI to replace their workforce have little foresight. …" (ytc_UgyYbaCx_…)
- "Senai B okay??? I stand by my comments many of these AI programming software is …" (ytr_Ugw43hPMV…)
Comment
Self driving car is a great idea, no doubt, it saves time and improves driving experience, but there are two main issues: first, countless difficult scenarios, such as weather influence, icy road, flooding, etc, secondly, it might invite mass attacks such as computer virus attacking self-driving or centralized road network system. Human drivers will definitely fail to compete with machine in repetitive tasks, but our independence is the best public safety belt.
Platform: youtube
Topic: AI Harm Incident
Posted: 2016-03-03T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UggLgMjOnAq3engCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghyqqDTlrLf9HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjyLWph_MtItXgCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugigy3nbNEhlSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UghcuF6gJA-fpHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgjTLCUkXByJc3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggLICqx-XT7aHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgiesN3Zk63rRHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjRuKELFIGsrXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjtG81Si3yyjHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
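The raw response above is a JSON array of per-comment records, one per coded comment, each carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked is below; the allowed values are inferred only from this sample batch and the result table, so the actual codebook may define additional categories, and `validate_batch` is a hypothetical helper name, not part of the pipeline shown here.

```python
import json
from collections import Counter

# Allowed values inferred from this sample batch and the result table;
# the real codebook may contain more categories (an assumption).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate_batch(raw: str) -> Counter:
    """Parse a raw LLM response and tally (dimension, value) pairs,
    raising ValueError on any record outside the inferred schema."""
    records = json.loads(raw)
    tally = Counter()
    for rec in records:
        for dim, allowed in SCHEMA.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
            tally[(dim, value)] += 1
    return tally
```

Run against the raw response above, this would (for example) count two `policy=regulate` records in this batch of ten; a record with a value outside the inferred sets fails loudly instead of being silently counted.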