Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugx0QRTwr…`: "I have trained my chat gpt to write to get the lowest score possible in ai detec…"
- `ytc_Ugw3FIzct…`: "Me if someone sees my chat with a character ai of shadow the hedgehog-: 👁️👄👁️💧…"
- `ytc_UgzloT9d7…`: "For sure AI is bound to become powerful in the near future it will be integrated…"
- `ytc_UgxqEdNv-…`: "With the rapid advancement of artificial intelligence, many are pointing to the …"
- `ytc_UgzwHtJZp…`: "Artificial intelligence only knows what it's been told. Sounds like you need to …"
- `ytr_UgyNEs075…`: "Your claim is incoherent, because learning and being inspired by a human is not …"
- `ytc_UgxlZOmgi…`: "I’m no expert or any training how to make these things, but could the hallucinat…"
- `ytc_UgyyZghA6…`: "I can tell you one thing; when FSD is about to crash, it automatically disengage…"
Comment
The main problem with self driving cars isn't even technical or the fact they are not 100% reliable. They are already safer than people are on average, but they are still not perfect & accidents still happen. The problem with self driving cars is who to hold liable for a collision. The most basic thing you have to prove in any traffic offence is driver, as any traffic cop will tell you. This is the real reason driverless cars are not here yet, and why they most probably won't be for the foreseeable future - even if all the technical issues were fully solved.
youtube · AI Moral Status · 2025-04-27T10:0… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
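Each coded record uses a fixed label vocabulary per dimension. Below is a minimal sketch of validating one record against that vocabulary; the allowed-value sets are inferred only from the values visible on this page, not from an authoritative schema, so they are an assumption.

```python
# Label vocabularies inferred from the codings shown on this page
# (assumption: the real schema may contain additional values).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "investor",
                       "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result from the table above:
coding = {"responsibility": "company", "reasoning": "deontological",
          "policy": "liability", "emotion": "indifference"}
print(validate(coding))  # → []
```

A record with an out-of-vocabulary value (or a missing dimension) comes back flagged, which makes malformed LLM output easy to catch before it reaches the table view.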
Raw LLM Response
```json
[
{"id":"ytr_UgxXyHi07JbUJV5WRyh4AaABAg.AHOl9YNNHrTAHOnu9timff","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxXyHi07JbUJV5WRyh4AaABAg.AHOl9YNNHrTAHPV1bxyD0m","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyCpaxvOq93JVngQKF4AaABAg.AHOjPO6UjYlAHRAQWzRSkZ","responsibility":"investor","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgyCpaxvOq93JVngQKF4AaABAg.AHOjPO6UjYlAHRNnoscj6K","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgznuBOtk_zcMDWhIc54AaABAg.AHOjGZ1kGPxAHQ3DQpOViq","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgznuBOtk_zcMDWhIc54AaABAg.AHOjGZ1kGPxAHRilTlUDbf","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytr_Ugxber59mLHvkP1w0Td4AaABAg.AHOfuKMoZivAHPSd07TS7b","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugxber59mLHvkP1w0Td4AaABAg.AHOfuKMoZivAHPbCM7SQms","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytr_Ugzep7LKNulwIY71soN4AaABAg.AHOfhV1UaDHAHOgKgr_n68","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugzep7LKNulwIY71soN4AaABAg.AHOfhV1UaDHAHR8zUiNcNM","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
```