Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "ya were all about to get replaced by robots. at which point ai will probably jus…" (ytc_UgwQstxTd…)
- "Well as a future programmer I'll program the option that avoids killing or kills…" (ytc_Uggfmpuz0…)
- "How are these big companies (who can afford ai machines to replace humans) gonn…" (ytc_UgzDfHh7H…)
- "As a musician indeed I feel robbed by AI of my 20+ of musical studies and practi…" (ytc_Ugz01f_BL…)
- "its fun and games until the robot doesn't give the gun back and shoot the man an…" (ytc_UgwIF4T4i…)
- "chatbot: What do you want me to do? me: get a moral code. answer all q…" (ytc_Ugzept16x…)
- "Ai is useless and humans do not need it to exist! We were doing fine without it …" (ytc_Ugz2tAPrF…)
- "Being polite doesn't cost anything and it makes the world just a little kinder, …" (ytc_UgyWh5E_c…)
Comment (verbatim, as posted)

> Actually it's not dilema considered if it is controlled by drivers instead. A.I can programmed to choose the action that priortise saftey of passenger first, followed by best outcome by minimising casualties, at the end the ultimate fault goes to the truck driver since self driving cars would have on board cameras to record evidence plus event logs for making decisions, in which it will be solid evidence of accident and much more trustworthy than human witnesses

youtube · AI Harm Incident · 2017-05-30T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UghMzFQ5uciXyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghSVao5v-7LzHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghWZB_DNXhaTXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj7Q3CElinFQ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgjTiwroBtb2T3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjIFRBxgjA2tXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghlT0jEO-duZ3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh8Tr7F8wrmeX3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugi0hd2FnlV7Z3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugil9BPZ0b0LongCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
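The raw response is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions shown in the table above. A minimal sketch of how such a payload might be parsed, validated, and indexed for lookup by comment ID (the field names follow the response above; `index_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Two entries copied from the raw LLM response above.
raw = """[
  {"id":"ytc_UgjTiwroBtb2T3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugj7Q3CElinFQ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

# The four coding dimensions every entry must carry.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(payload: str) -> dict:
    """Parse a raw response and index codings by comment ID.

    Raises ValueError if an entry is missing its ID or any dimension,
    so malformed model output fails loudly instead of silently.
    """
    codings = {}
    for entry in json.loads(payload):
        missing = DIMENSIONS - entry.keys()
        if "id" not in entry or missing:
            raise ValueError(f"malformed entry: {entry}")
        codings[entry["id"]] = {k: entry[k] for k in DIMENSIONS}
    return codings

by_id = index_codings(raw)
print(by_id["ytc_UgjTiwroBtb2T3gCoAEC"]["policy"])  # liability
```

With an index like this, "Look up by comment ID" is a single dictionary access, and a truncated or malformed model response is caught at parse time rather than surfacing later as a missing coding.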