Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
Random samples:

- "To be fair, I don't think AI will take over art jobs, but it will shift some gea…" (ytc_Ugy7Z7q7w…)
- "Make AI rich again 😂😂😂😂. People can get free money from government.. basic incom…" (ytc_Ugzohbyzw…)
- "We wrote tons of fiction about AI rising up against its creators and then traine…" (ytc_UgxwCJHwb…)
- "And there are people who think ChatGPT and other such programs can actually repl…" (ytc_UgwiFn6le…)
- "They can't move regulations quickly enough to keep up with AI development. They …" (ytc_UgxgiuCbq…)
- "I am just as cautious, if not more so, when using self-driving technology as I a…" (ytc_UgwAcUNaT…)
- "i've tested a lot of AI humanizers but this one really delivers! i love it…" (ytc_UgxmSg_H4…)
- "we need ai to take over the harder jobs that we as people are unable to do in a …" (ytc_Ugzqz_KrY…)
Comment
That happens, when you replace a system like radar, which measures real distance, speed or acceleration, with a AI systems, which estimates the same parameters, based on „experience“. I am pretty sure, also cost reduction played a role in this step. Unfortunately, learning from 1000th of new situations, also can cost the life of other motorcyclists or people on the road. Banana product = gets ripe at customer.
Source: youtube · AI Harm Incident · 2023-06-04T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz5JvaEtbdngAYBKr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyTd9RcmLc80IMz_bx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz0YauaoeG4gKPnGCh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugw_YI2Ifg0f1gGbMfl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxqOTDgKAHDoa3cB094AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxXzGMjClFBAz7qKrZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwDmUxkMau3kOQUJwB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzi3RVNii45NRTl9Ox4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyPNJQFb_cWuftG0eR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx8lhcmmwi7e3__7UR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
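A raw response like the one above must be parsed and validated before the codings are trusted. The sketch below shows one way to do that in Python; the allowed values per dimension are inferred only from the coded examples shown on this page (the actual codebook may define more categories), and the `validate_codings` helper is a hypothetical name, not part of any tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the examples above
# (assumption: the real codebook may contain additional categories).
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"liability", "ban", "regulate", "industry_self", "unclear"},
    "emotion": {"resignation", "fear", "outrage", "indifference", "mixed"},
}

def validate_codings(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw_response)
    valid = []
    for row in rows:
        # Each row must be an object with an "id" and a recognized
        # value for every coding dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid
```

Rows that fail validation (missing ID, unknown category) are dropped rather than repaired, so malformed model output never silently enters the coded dataset.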