Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I talk to AI and that is not the problem can't blame everyone for someone else's…" (ytc_UgyXBm7y8…)
- "This is science fiction or psuedoscience manifesting before our very eyes. ' Do …" (ytc_UgxOrs0gO…)
- "Only arguement that I've seen hold any water is essentially a digital collage. T…" (ytc_UgylgBJJr…)
- "Sounds more like mobile phones to me. People are slaves to their phones, AI is m…" (ytr_UgyjyLvNC…)
- "I suppose the strange situation with the general "population" of AI becoming sma…" (ytc_UgxCFRgEV…)
- "Blueprint for a Utopian Civilization: Humanity and Super-Intelligence in Symbios…" (ytc_Ugy5w-Esm…)
- "You have to control what information it has access to. If you allow it to acces…" (ytc_Ugz3fyg9B…)
- "AI isn't smart in itself. All intelligence needs a framework to grow from. We ha…" (ytc_UgzAObtZc…)
Comment

> The owner of the car, duh. Why is this even a question? This is how it already works. If your brakes fail due to a fault in the car itself, and you rear end someone, you're still at fault. Your insurance pays out. You can go sue the car maker later for the faulty brakes, but as far as the traffic laws go, the accident is your fault and you pay, period.
>
> This is going to be a non-issue anyway considering there will be vastly fewer auto accidents with driverless cars and insurance will be dirt cheap because of it.

Source: youtube · Topic: AI Harm Incident · Posted: 2016-03-26T00:2… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwYshySr1KKtQlUrWh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwm77fbkqFuJ6JdfCZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwn0pu2DCy34x1c1md4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwfwoAS2VTAqnTId1F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugh5_R7ugVcnAXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghK9JWzzYfksHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggrhJ60UdmN_3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgivsdFhEaHtTngCoAEC","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzkrTgLLBZw79vHxyl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw5GVwik-fYBdaqPQh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
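Since the raw LLM response is a JSON array of coded rows, a small validation pass can catch malformed or off-vocabulary outputs before they reach the coding table. The sketch below is illustrative, not this tool's actual code: the `ALLOWED` vocabularies are inferred only from the values visible on this page and may be incomplete, and `validate_batch` is a hypothetical helper name.

```python
import json

# Dimension vocabularies inferred from the outputs shown above (assumption:
# the real schema may define additional values not seen in this sample).
ALLOWED = {
    "responsibility": {"user", "company", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown dimension values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Example with a single well-formed row (hypothetical ID).
raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Validating against a closed vocabulary like this is what lets the per-comment "Coding Result" table above render each dimension without further cleaning.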