Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Yeah, you guys are worried about art being overrun by AI. That affects a tiny fr… (`ytr_UgyTYedGy…`)
- "Why do I need a driverless truck?" Why do I need Mexicans or Indians or "immigr… (`ytc_UgzK81doy…`)
- We need to try communism- again. If not AI, nuclear war will do us in.… (`ytc_UgxS9P9mZ…`)
- Yang got shunned for UBI by the very demographic of workers that have and will c… (`ytc_UgwvU1ost…`)
- Infact ai needs a constant flow of high quality human productions to work "prope… (`ytc_Ugy003Xms…`)
- Just reading the title it makes me think A.I is All knowing ( for the most part … (`ytc_UgzclhE4T…`)
- When you say you're worried that the nightshade artifacting might look too obvio… (`ytc_UgwcJq9gR…`)
- Very well said, a legend in the arts space named Steven Zapata uses this as his … (`ytc_UgxXIYD5E…`)
Comment
I think the self-driving car always should favour the lives of the people outside. Also, I think the driver should always have his hands on the wheel. If something comes up, the driver could make take over and make a decision. *edit* Let's be clear, though, self-driving cars are much better than humans.
youtube · AI Harm Incident · 2020-11-26T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugyr0RMpIPjrTwnqobN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-IgbdLnExAW_EhZF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxN7dBvLWbIzar2GFB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEykcQ7MVTCsx7g7d4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwvKTI5iOHqUc77Qtp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0YcSTWrba2PmZtwp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugylzv1Xaz0MgWpxUfJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw3EQZ9U6NBDWibkPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxJ57uNPwCyWaPdS5p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxsG5XLEWsnnq4EsCR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
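The raw response above is a JSON array of per-comment codes, one object per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a response could be parsed and validated before use is below; the allowed value sets in `SCHEMA` are inferred from the records shown here, and the real coding scheme may include additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample records above
# (assumption: the real schema may define more categories than appear here).
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed", "unclear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping any row
    whose values fall outside the allowed sets (e.g. a hallucinated label)."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in SCHEMA}
        if all(codes[dim] in SCHEMA[dim] for dim in SCHEMA):
            coded[row.get("id", "")] = codes
    return coded

# One record from the response above, used as a lookup example.
raw = '''[
  {"id": "ytc_Ugx-IgbdLnExAW_EhZF4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "approval"}
]'''
coded = parse_response(raw)
print(coded["ytc_Ugx-IgbdLnExAW_EhZF4AaABAg"]["policy"])  # regulate
```

Validating against a fixed value set catches the common failure mode of free-text LLM output drifting from the codebook; invalid rows are dropped rather than silently stored.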