Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Get enough self driving cars on the road (as replacements to human driven cars) …
rdc_cpnipbi
I've got to say no to this one. The AI and the programmers are in no way respons…
ytc_UgxHxV69L…
they could be engaged if they wanted to. it's not like their research scientists…
rdc_grrmpvn
@inlandish Yeah, nowadays most of our interactions with the internet are in some…
ytr_UgxKPhFmV…
No he's fighting a man, its edited to look like the puncher is a robot…
ytr_UgxymDqda…
Perhaps you'd like to go back to having human switchboard operators to connect y…
ytr_UgzlHoo4e…
Are you a bot? I agree with you but this phrasing is so bot like…
ytr_UgytIIAId…
Man, AI for coding is irreplaceable. I can start a project and have a working pr…
ytc_UgxZLvOZI…
Comment
Excellent piece. AI is not ready for prime time. I have turned to minimum the few 'auto-assist' in my car. Why? Because when i am in the driver's seat, i have one job only: drive the car. No home entertainment system, no phone, no meals. Just drive. On my bike, i have to watch both front and back. It's a lot of work but the only one keeping me alive is me. Tesla, and others, are making lazy drivers which is a detriment to everyone. There is little incentive for drivers to pay attention because "the car will save me". All this increased 'safety' is making worse drivers and less safety. Maybe fewer deaths but definitely more crashes. Again, great video Ryan.
youtube
AI Harm Incident
2022-09-03T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgywKea9FVDKRWT2zkt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwapqUnz0wZE27PIrl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxDyofChHc-RsCUW4J4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxg0br98FBuetdkGBB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugy-fViY6LSffaFLoQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyBn55g57TjtKg3HCB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugytgd9wUZAC6X9AjTt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwthx6XQkQcPy8-Ld54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxLVyHTmat2UcgQ4e94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9dwS2vlANaZ76PvF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
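The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions. A minimal sketch of parsing and validating such a response before storing it — note the category vocabularies below are inferred only from the values visible on this page, and the real codebook may define more options:

```python
import json

# Allowed codes per dimension, inferred from the values observed above
# (assumption: the actual codebook may include additional categories).
SCHEMA = {
    "responsibility": {"developer", "company", "distributed", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "resignation"},
}

def parse_llm_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows with an id and in-schema codes."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        has_id = bool(row.get("id"))
        in_schema = all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
        if has_id and in_schema:
            valid.append(row)
    return valid
```

Rows that fail validation (missing id, or a code outside the vocabulary) are dropped rather than repaired, so a malformed LLM response surfaces as missing codes instead of silently corrupting the dataset.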