Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "But what about sentient's that are nothing like humans? For example if someone w…" (ytr_UgixNOeeO…)
- "That is true, but only in the near future. Further ahead, the future is bleak fo…" (ytc_UgxrFAdir…)
- "The ai hatred cope is strong with this one. A soul is easily created with enough…" (ytc_Ugx1zfFKW…)
- "People just don't respond to any AI communication will be a start to shut it dow…" (ytr_UgwbMGmrX…)
- "That AI ASMR was interesting, but it really shows how much brands need to track …" (ytc_Ugzc2TDqL…)
- "Will sex robots have AI so you can talk to them - would be hilarious…" (ytc_UgxiTwSt1…)
- "I'd like to argue the absurdist point of view: Are we not just a bunch of eleme…" (ytc_Ugz5jjAsU…)
- "Great thoughtful comment. Observation: watched Rick Beato do one of his top ten …" (ytr_UgzMmLfe5…)
Comment
I do and don't agree with a few things mentioned in this video. It looks like they decide who they can sue. It's easy to blame one side (Tesla), rather than the other side (driver of the car). As far as I know, the driver still has to be in control (can't go to sleep), so why did the driver not react? I should say drivers, since it happened twice. Radar needs something that reflects the signal. It has been said that a Corvette can't be "seen" by the radar of a following car with adaptive cruise control. With adaptive cruise control, the driver still has to steer the car, but the car keeps the distance to the next one automatically.
youtube · AI Harm Incident · 2022-09-16T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzEzw4ccp8J7Arzvdp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwBXV0RxeUpen6ZGbd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyoK1cpLNqaZV80fHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRyADljOnn2vgpll14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzEY4gO8WBaJjMQscB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzywqUx_ffVWDIcZmJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz0ohamZ0X3Wpqd5zR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy-qZSzVScDwQXFVAl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwlUkkEdenbwVNiqg14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy9nu5XiXK84cthkE14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
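The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such output could be parsed and sanity-checked before it reaches the coding-result view; the allowed label sets below are assumptions inferred from the values visible in this sample, not a confirmed codebook:

```python
import json

# Label vocabularies per dimension -- assumptions inferred from this sample,
# not a confirmed schema.
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "industry_self", "liability", "ban", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must be an object with an id and a known label
        # for every dimension; anything else is dropped.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical payload: one valid record and one with an unknown label.
raw = '''[
  {"id":"ytc_UgzEzw4ccp8J7Arzvdp4AaABAg","responsibility":"government",
   "reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_example_bad","responsibility":"martians",
   "reasoning":"mixed","policy":"ban","emotion":"fear"}
]'''
codes = parse_codes(raw)
print(len(codes))  # prints 1: the record with an unknown label is dropped
```

Validating against a closed vocabulary like this catches the most common LLM coding failure, an out-of-schema label, without having to trust the model's formatting.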