Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgwdavOLI…` — Thank you Mr Elon Musk , wisely highlighting the AI dangers on public risk,the n…
- `ytc_UgzmPvF4s…` — "Autopilot" is not the same as "Full Self-Driving". They are totally different s…
- `ytr_Ugy9PcIT9…` — *ehem* let's not discuss how much art ai has stolen. Artists don't want their li…
- `ytc_UgwLzCXys…` — If anything isn't he stealing the AI's hard work. All you did was tell it was to…
- `ytr_UgwbLMH0x…` — @thewannabecritic7490 The platform where this pointless 20 min video is hosted i…
- `ytc_UgzgPtwsJ…` — Good things come from bad things, and remaking this ai image is a better way rat…
- `ytr_UgyjSJwSS…` — @laurentiuvladutmanealol AI isn't a person. Where do you come up with such nonse…
- `ytc_Ugwhfg1Qb…` — RMW/RCRA transporters will never be automated and it's a growing business; it's …
Comment
Why isn't the news for the Boeing 737 Max mention anything about the use of an AI system? I only knew it is because of software faults. Boeing should be more transparent with this AI system usage. They should at least use and test flight more than ten times before selling it commercially. Why the design faults only happen on two third-world countries? Everyone has the rights to know about the AI system used in this plane that they are on board. They should announce that this plane uses an AI system. The thought of AI's lethal weapon really strike a core to humankind's fear @8:01 and the commentator highlighting @8:25 was scary.
Source: youtube · Category: AI Governance · Posted: 2023-03-19T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzLVFjDsJGJD471yYx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxSAtT93udw6ci31_l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxWhzK0jyYuQmh5LBZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwRgjZzrNT8Mr7XtKd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxh8YHVVmyk3PXDs5x4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyu3PyJSQBjOjguNYh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwZEKFaF2rZvSIRUZF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxkbbBer6DnqJUfo_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwni-8yjbvjREV2luJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy2dkMvZhjlMtODkGx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
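The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated, assuming the dimension vocabularies visible in the samples above (the full codebook may define additional values; the out-of-vocabulary row here is a hypothetical illustration):

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above. This vocabulary is an assumption; extend it to match the codebook.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "approval", "fear", "outrage", "indifference", "mixed"},
}

def parse_batch(raw_text):
    """Parse a raw LLM batch response; drop rows with out-of-vocabulary codes.

    Returns a dict mapping comment ID -> coded row, so results can be
    looked up by ID the way the inspection view does.
    """
    rows = json.loads(raw_text)
    valid = [
        row for row in rows
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items())
    ]
    return {row["id"]: row for row in valid}

raw = """[
  {"id":"ytc_UgwZEKFaF2rZvSIRUZF4AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_HYPOTHETICAL_BAD_ROW","responsibility":"company",
   "reasoning":"deontological","policy":"regulate","emotion":"joy"}
]"""

coded = parse_batch(raw)
# The second row is rejected: "joy" is not in the emotion vocabulary.
```

Filtering rather than raising keeps one malformed row from failing a whole batch; rejected IDs can then be re-queued for recoding.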