Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I'm sorry to hear that you feel that way. If you have any specific concerns or q…" (ytr_UgzL7Q2SU…)
- "I remember an article about a decade or so ago; the message was by 2025 there wo…" (ytc_Ugx4190as…)
- "You are godlike as in susceptible to human expression so ai is scribbling of pas…" (ytc_UgxNSXKik…)
- "I think, purely legaly thinking, not moraly, if the AI can be used to reproduce …" (ytc_UgyF_MD1_…)
- "At this rate we should destroy these general intelligence AI's now before this g…" (ytc_UgyVG-74W…)
- "How long before an AI encourages someone to harm another human being, perhaps to…" (ytc_Ugxod2zCV…)
- "Bruh why does no one realize this AI ART IS PROGRAMMING NOT ART WE ARE GIVING IT…" (ytc_Ugy754zm4…)
- "Are usually late to the game man I already told artificial intelligence to embed…" (ytc_Ugz2QWbjc…)
Comment
> I think there is three simple ways to solve this dilemma.
> 1. Buyers of cars must choose options of car's behavior while they buying on their new car. This options must be documented with signatures and witness. So that responsibility will be on car owner.
> 2. Car companies should build cars which looks the same, but programmed in different ways. Manufacturers should build some cars with drivers life priority, and some other with other's lifes priority. So buyers can simply choose lovely car with preferred behavior. So that responsibility will be on car owner.
> 3. If all cars would be self-driving, than they can communicate with each other (but sadly can't communicate with pedestrian). And they could simply prevent deadly accidents
>
> Moreover different countries and cities might create policies benefiting those, who drive cars with other's lifes priority
youtube · AI Harm Incident · 2022-06-08T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxi9edyRH6MBe-gmlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyHzAmobw_w11Mnb-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxyiKB7SdSvnhxtM-N4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDx5EZ27hVBDO-G3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy52cuaNxv0pCVCSnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbIlfqTdOKTNc9ph14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxh5MKYmtMPkv_l1S94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlSt_4SUBX-NmdT0p4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx2arFYsbTn-QoyeUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]