Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click a row to inspect):

- "Not to be too negative right out of the gates, but considering how everything is…" (ytc_UgyuOcsT1…)
- "Art styles change, if you look at the first video she uploaded on here and compa…" (ytr_UgzYabhWl…)
- "In fairness, what's the source when the AI chief at Microsoft claims that most w…" (ytr_Ugy1-pU4D…)
- "Computerphiles stamp collector example is a pretty good one. At some point an ai…" (ytc_UggDWsBkR…)
- "Learns by stealing from other people and jumbling it together is how humans lear…" (ytc_Ugz3eqV1R…)
- "On the parents' side AI chatbots are fundamentally flawed no matter what guardra…" (ytc_UgwrSMSYI…)
- "i have a theory. ai mirrors what we do when we chat with it, i think it identifi…" (ytc_Ugw8lNF-y…)
- "If your serious, this is the point Americana become separate but equal. As we a…" (ytc_UgxTc7-vx…)
Comment

> Not only will it be long term, eventually cars will drive themselves. Weather you like it or not. The driver is ultimately still responsible for what happens and they should be prepared to take over at any time. The accidents are happening because the drivers are not paying attention and taking over when the system makes an error. The problem is two fold. 1) the marketing is misleading and 2) The drivers are not paying attention as they should be and as a result are unable to intervene to prevent the accident. The marketing needs to change…its not full self driving…and the user should have to complete education on how the system works, including its limitations, before they sre able to use the features.
youtube · AI Harm Incident · 2025-09-28T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwuQRD2kupySFWdpTh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfovMhdm3eLeQV9G14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzXhs9F4VXJJur7V4d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxm7nng80gPtO_qWEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznIuALORolMMREhxh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRAWB1j9sIV36PwAB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw4a-Xy9TqWSUF5xWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwnoKYJtEYMF6GUwbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwvKXyJ3tTKFOnqNpd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1gx3jQu_InJLPGqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
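A response in this shape is straightforward to parse and sanity-check before the codes are stored. Below is a minimal sketch of that step: it loads the raw JSON array and verifies each record's dimensions against allowed value sets. The `ALLOWED` sets are hypothetical, inferred only from the values visible above; the actual codebook may define more categories.

```python
import json

# Allowed values per coding dimension — inferred from the responses shown
# above; the real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record response, mirroring the format above.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(validate_response(raw)["ytc_example"]["emotion"])  # indifference
```

Rejecting any off-schema value at parse time keeps a model that drifts from the prompt (for example, inventing a new emotion label) from silently contaminating the coded dataset.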