Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
well not all lawyers. but if every tax lawyer disappears. then what happens when…
ytr_Ugxxcq24O…
Did you forget the deep fakes used by Modi and Indian gov during lockdown? And t…
ytc_UgxhcsWkU…
He planning started the crazy AI was anti human from Microsoft program to bait g…
ytc_Ugwo1pAGr…
It's difficult enough to reason with people that AI responses are related to pro…
ytc_UgyeLEcpX…
To all AI haters. Grow up, learn new technologies, be smart, and try to fit in t…
ytc_Ugw9a5g6D…
My dad said use ai because I’m a artist I said no but it’s not cool that people …
ytc_UgxvyXhTk…
A hypothesized future sapient robot being offended by our treatment of current l…
ytc_UgxaNtoZk…
Ai in its current form does not even resemble intelligence, artificial or otherw…
ytc_UgzqGF1ay…
Comment
One thing to consider is that self driving cars almost entirely eliminate human reaction time. This means the car can make decisions significantly faster than humans and dramatically reduce the chance of a collision taking place. Granted there will be some scenarios where a collision may be unavoidable, but in practice given collision rates of self-driving cars vs human-driven cars, especially when you consider evolution of self-driving cars working together on the same road sharing information with each other in order to even further reduce risk, I don't think this will be a problem.
Source: youtube · Incident: AI Harm Incident · Posted: 2017-10-30T11:3… · ♥ 22
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgziBNScDMqK7LUbKCJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaLQv3Hr9M9mYKhHV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwrr6SC95igxxi3KxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkhoJUXCOdqrrdO_F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwE0M-nX5hQSlgJMIt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyO36Rq0dURj-pha1N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzk7FDfT1Jr5iwRctZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxLZAyTObU_AerGybR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxgjwd0T5M4e_QWPzp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRFrJblnQUBMcSBhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
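A raw batch response like the one above can be turned into the per-comment coding tables shown earlier by parsing the JSON array and indexing it by comment ID. The sketch below is a minimal illustration, not the tool's actual implementation; the allowed value sets are assumptions inferred only from the labels visible in this batch (the full codebook may define more categories).

```python
import json

# Dimension values observed in this sample batch (assumption: the real
# codebook may allow additional categories beyond these).
ALLOWED = {
    "responsibility": {"none", "company", "developer"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "approval", "outrage", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into a lookup keyed by comment ID."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        # Reject values outside the expected label sets instead of
        # silently storing a hallucinated category.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"none",'
      '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
result = parse_batch(raw)
print(result["ytc_example"]["emotion"])  # approval
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each coded comment is a constant-time dictionary hit rather than a scan over the raw response.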