Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I think there should be some kind of ministry of AI safety and people who will c…
ytc_UgzvTclhz…
surgery is already robotic in most areas with DaVinci and similar robotic aids. …
ytc_UgzSXnLrb…
AI Will never beat his creator human, if AI can beat human, human can beat god o…
ytc_UgzuBCfBw…
I agree to some extent but I believe that's not entirely true because creativity…
ytc_UgyBkwzHL…
The world is not ready, neither socially, nor economically, nor morally, for the a…
ytc_UgxwBC8qf…
This was shot 3 weeks ago, the information is outdated at this point. AI will de…
ytc_UgzO5Hs5s…
AI just has to be good enough for the general masses. We're not there yet but cl…
ytc_UgzkfS8IU…
Artificial Intelligence the future of humankind, Time magazine 2017 many tech co…
ytc_UgyI4FGxN…
Comment
What am I missing here? When you engage Autopilot, a Tesla CLEARLY tells you to stay engaged and be prepared to take over at any time. There’s even a camera inside the cabin to monitor your attention and cancel the autopilot if you’re not paying attention. Tesla is one of many car manufacturers with “self driving” cruise control, but it’s the only one that cops flack because of its high profile. I’m not a church of Elon follower by any means, but this is just sensationalised news when really the discussion should be that people are using this as a crutch when it’s merely a tool😊
youtube
AI Harm Incident
2025-04-14T09:3…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxR9I6PNNLzQleDQGN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwCVe-uLqHzpaQwRyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgweWosgEdBDbM1J9t54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVzcrXsyuBHAsJZIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwztLjEZzLc6Zu7oxp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzNVC1ydO0c-soNTr14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3JtDkh1o42myOp4d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwskBsLDHm3rIbREOF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwv_5nV_2Z909Ro70h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyE2r3cdmW6pb2iWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
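The raw response above is a JSON array of per-comment codes, one object per comment ID, with the same four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a response (the allowed values are inferred from the examples on this page; the actual codebook may define additional categories, and `parse_codes` is a hypothetical helper, not part of this tool):

```python
import json

# Allowed values per coding dimension, inferred from the examples above.
# The real codebook may include categories not seen in this sample.
ALLOWED = {
    "responsibility": {"none", "user", "government", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows with no comment ID
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep the row only if every dimension has a recognized value.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codes = parse_codes(raw)
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook; such rows are dropped rather than stored.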