Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples — click to inspect:

- "Sorry truckers... But I am all for it. Back in the day I had the highest res…" (ytc_Ugxat9iGA…)
- "Now you may hate me for this, but hear me out: I don't think that AI models lear…" (ytc_UgwqtZ41p…)
- "They have to accept that AI will be part of SAG and move on. Embrace and work to…" (ytc_Ugz-u_s6R…)
- "Oh come on….. do you want to see Robotaxi succeed? It went out of its lane…. …" (ytc_UgxvqgAEF…)
- "The day a mad man will hack in to these autonomous machine's for personal demoni…" (ytc_UgwlFhIG4…)
- "Safety ? Is it bulletproof? Stupidest idea in the world to have driverless cars?…" (ytc_UgwglINiV…)
- "Another part of the short-term to mid-term predictions is that decreasing the co…" (ytc_Ugw5kIsLp…)
- "I doubt it. What are you building as a business without consumers and an economy…" (ytc_Ugy6Auo4I…)
Comment
My thought is, the driver should be paying attention to the road and always be ready to reclaim control. If the person doesn't, that means they were impaired or not paying attention, and they probably would have gotten into an accident anyway. But the self-driving aspect adds another set of eyes and a very smart ai that will only aid the driver in keeping the car safely on the road. If you become unconscious, or just aren't paying attention, or can't react quick enough, or any other reason people get in accidents, the car is very likely to save you from that. That's what I think anyway.
Source: youtube · Posted: 2023-07-31T03:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxukQjqRR6f6Xre5sF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxgySTwFUJ43HVWlvt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxJqYuRi0G1MGQMPix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMJz-IxeP6ueISXph4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkSf9SF8XFo-Dn6xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_UNVwacIfOiICyyN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwutRFxldrAFzPqnBV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbghNUGweGjTHowWd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxyGYXtuJJyi7iO8_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
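The lookup-by-ID workflow described above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codes, as shown) and index it by comment ID. This is a minimal illustration, not the tool's actual implementation; the two rows below are copied from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (two rows excerpted from the full response above).
raw_response = """
[
  {"id": "ytc_UgxbghNUGweGjTHowWd4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxyGYXtuJJyi7iO8_l4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# Index the codes by comment ID for constant-time lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Looking up the comment shown in the detail view recovers
# the same dimensions as its "Coding Result" table.
row = codes["ytc_UgxbghNUGweGjTHowWd4AaABAg"]
print(row["responsibility"], row["policy"])  # user industry_self
```

Indexing once and looking up by ID is what makes the "look up by comment ID" view cheap even when a batch response contains many coded comments.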