Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a result directly by its comment ID.
Random samples (excerpts truncated):

- “A.I. in general CAN be beneficial / But in terms of making art, yeah it’s unreliab…” (`ytc_UgwLCc7-k…`)
- “The scariest moment will be when an AI spontaneously starts asking its own quest…” (`ytc_UgxTef8gZ…`)
- “Since the art was generated by an AI that was trained with stolen material taken…” (`ytc_Ugx4KHfRl…`)
- “I work in the building industry. It's going to be some time before I'm competing…” (`ytc_Ugz_eh3cy…`)
- “Self driving cars are openly a future now. The only question is how manufacturer…” (`ytc_Ugx5GQ9AU…`)
- “This argument doesn’t work with anyone who doesn’t see art for what it is, art. …” (`ytr_UgwqKopmF…`)
- “Luke Smith calls out the AI hype bullshit. / AI is a Nothingburger. You're wrong.…” (`ytc_UgyEcvQXS…`)
- “I don't know what the "biggest danger" of AI might be. As the woman in the video…” (`ytr_Ugx0yZ1hU…`)
Comment

> Elon Musk is already among the first on the cusp of a best answer to risk from AI called Neuralink leading to a merging of Humans and Ai. It will be us and we will be it. First, we should use AI to reduce human aggression levels and increase intelligence; and then, to merge. The power of future AI and technologies will be so powerful that the only option will be for mankind to become the meek that inherits the earth, or we will self-destruct. Sadly, most media accounts rarely mention merging as an answer to saving mankind from future AI amok.

youtube · AI Governance · 2023-12-04T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyyXjTzLPsiVvs1f-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdrxT1yIN3xcrj4q94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxd1vvjnuueT7e6dPt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzWWlzCnaE1ul-0zaZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxk8hyXb-KQj8RqVJl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxazyOYSiLsGQM2x9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRRuR2zqqe1_Om22x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz93K-qM5KUdXOFvtx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwHb1tUzIvVBygiQRd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySoa9MwZHrkHkuDaN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
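The by-ID lookup this page offers can be sketched in a few lines: parse the raw batch response (a JSON array with one record per comment) and index it by comment ID. This is a minimal sketch, not the pipeline's actual code; the helper name `index_by_id` and the required-key check are illustrative assumptions.

```python
import json

# The coding dimensions each record is expected to carry (assumed from the
# records shown above, not a documented schema).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response (JSON array) and index records by comment ID.

    Records that are not objects or lack an expected dimension are skipped,
    since LLM batch output is not guaranteed to be well-formed.
    """
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if isinstance(r, dict) and REQUIRED_KEYS <= r.keys()
    }

# Two records copied verbatim from the raw response above:
raw = '''[
{"id":"ytc_UgyyXjTzLPsiVvs1f-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxd1vvjnuueT7e6dPt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

codes = index_by_id(raw)
print(codes["ytc_Ugxd1vvjnuueT7e6dPt4AaABAg"]["emotion"])  # approval
```

Indexing once and looking up by key keeps repeated inspections O(1) instead of rescanning the array for every query.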