Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "What a lovely future we have. Hopefully they take it away from us, with our SELLOUT RULERS…" (translated from Spanish; ytc_UgyZvR4MR…)
- "@_B_E With a 3D sculpting program you're still choosing how to sculpt, what to k…" (ytr_UgwqQHIx-…)
- "Like all tools, I think AI can have a positive impact. However, it's not the sam…" (ytc_Ugy6Dnkkp…)
- "Just because a bunch of pathways can operate a language doesn't mean that it is …" (ytc_Ugx8WFtjo…)
- "The cat thing wasn't a smart choice AI: Do lobsters need water? AI-overzicht (AI overview) …" (ytc_Ugz_SsUfA…)
- "Damn..... Can't find a reason to give a fuck I don't need an ai to do my shit fo…" (ytc_Ugwfgpm2e…)
- "I don't agree AI drones remove moral cost anyhow. Same way traps and mines would…" (ytc_UgzxGKMkg…)
- "I’m sorry? Excuse me, but if us artists weren’t here right now.. no one would ha…" (ytc_UgwhNioRm…)
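The samples above are drawn at random from the coded corpus for spot-checking. Below is a minimal sketch of how such a random-sample view could be produced; the file name `coded_comments.json`, the field names, and the preview length are assumptions for illustration, not the code behind this page.

```python
# Illustrative only: draw a handful of coded comments at random and print a
# short preview plus the comment ID, mirroring the "Random samples" list above.
# The input file and its schema (a list of dicts with "id" and "text") are assumed.
import json
import random

def random_samples(path="coded_comments.json", k=8, seed=None):
    """Return k randomly chosen coded comments for manual inspection."""
    with open(path, encoding="utf-8") as f:
        comments = json.load(f)
    rng = random.Random(seed)
    return rng.sample(comments, k=min(k, len(comments)))

if __name__ == "__main__":
    for c in random_samples(seed=42):
        preview = c["text"][:80].replace("\n", " ")
        print(f'{c["id"]:<35} {preview}…')
```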
Comment

> The scenario is not realistic... the speed of the boxes should be taken into consideration and the car will stop in time without hitting anyone.. not to mention that other cars will give way. Not to mention, there will be no bikes. Not to mention safety distance taken by a robust is far larger than a human, not to mention low speed and safe driving rules programmed.
>
> I would go as far as saying driverless vehicles will be at least 100 times safer than human drivers.

Platform: youtube
Topic: AI Harm Incident
Timestamp: 2018-12-07T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
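For reference, the four coded dimensions can be held in a small typed record. The sketch below is illustrative only: the class and field names are hypothetical, the category values noted in the comments are just those visible on this page (the full codebook may define more), and the example instance assumes the first entry in the raw response below corresponds to the comment shown above, since its values match the table.

```python
# Hypothetical record type for one coded comment; not the page's actual schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str   # observed here: none, user, company, ai_itself, distributed
    reasoning: str        # observed here: consequentialist, deontological, contractualist
    policy: str           # observed here: none, regulate, liability
    emotion: str          # observed here: indifference, fear, approval
    coded_at: datetime

# Example built from the coding result shown in the table above.
result = CodingResult(
    comment_id="ytc_UgyReg2RJcQbRU8fXqh4AaABAg",
    responsibility="none",
    reasoning="consequentialist",
    policy="none",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
```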
Raw LLM Response
```json
[
{"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
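Since the model returns one JSON array per batch, looking up a single comment (as the "Look up by comment ID" box does) amounts to parsing that array and indexing it by `id`. The sketch below is a hedged illustration under those assumptions: the helper name and the validation checks are hypothetical, and real model output may require more robust JSON extraction than a bare `json.loads`.

```python
# Illustrative parser for a raw batch response like the one shown above.
# RAW_RESPONSE is trimmed to the first record from this page for brevity.
import json

RAW_RESPONSE = """[
  {"id": "ytc_UgyReg2RJcQbRU8fXqh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

def parse_batch(raw):
    """Parse a raw LLM batch response into a mapping of comment ID -> codes."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"model did not return valid JSON: {exc}") from exc
    expected = {"id", "responsibility", "reasoning", "policy", "emotion"}
    coded = {}
    for rec in records:
        missing = expected - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing fields: {missing}")
        coded[rec["id"]] = {k: rec[k] for k in expected - {"id"}}
    return coded

codes = parse_batch(RAW_RESPONSE)
print(codes["ytc_UgyReg2RJcQbRU8fXqh4AaABAg"]["emotion"])  # -> indifference
```

Indexing by ID this way also makes it easy to detect batches where the model dropped or duplicated a comment, since the returned keys can be compared against the IDs that were sent.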