Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugzrjnf9g…`: Seems like we need AI to replace the retiring boomer population. Our demographic…
- `ytc_UgynvSyfe…`: You have not heard every argument or you wouldn't be talking out against ai. My …
- `ytc_UgwALU9mA…`: you want to get the people in power to stop AI research? get the AI to solve all…
- `ytc_UgxaSXWMU…`: Let alone the fact that it over-personalizes anything it makes if it has at leas…
- `ytc_UgxC-WPHQ…`: @17:30-.-Couldn't this also be said for photography? Like, I don't think it's c…
- `ytc_UgzFaJIC0…`: STOP CALLING IT ART! You need to look up the definition of the word ART. What AI…
- `ytc_Ugxdw6iWc…`: Only excuse to have AI make something is either 1. Something so dumb no one woul…
- `ytc_UgwHOK68S…`: This ones on him not the ai he didnt clarify that he was planning on eating it😂…
Comment (youtube · AI Governance · 2025-06-17T05:3…)

> Pushing aside the multitude of efforts he is expending to help humanity, in a speech at MIT in 2014, Elon was the first person who warned us AI fo be the "biggest existential threat" to humanity. He also put his money where his mouth is and was one of the founders of OpenAI a year later to make sure AI stayed safe. Is Elon complicated? Absolutely, but to say that he has "no moral compass" is a bit unfair and wrong!
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxG1RkVBWoaNmCDl4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy_KZMMO_ewnQmyfMF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyGCo6E4VjMZHaqaXh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwbBItojtqLyf6dPpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzAbDM3gJCKhzqXHch4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyNgWhdPZX3HUPYIfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyQuAg_2fNYSkMCCIt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyREEU2GwQYbBGtG-F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzK_kPpbtVIHMaAHYR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9jMgaxfMRK7XXxuJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
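
The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a payload might be parsed and validated before display; the allowed label sets below are inferred only from values visible on this page, not from an authoritative codebook:

```python
import json

# Label sets per dimension, inferred from the responses shown on this page;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "government"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "liability", "ban", "regulate", "none"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "fear", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown labels."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} label {value!r}")
        coded[cid] = codes
    return coded

# Hypothetical one-record payload in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"virtue",'
       '"policy":"unclear","emotion":"approval"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # approval
```

Keying the result by comment ID makes the "look up by comment ID" view a single dictionary access, and the per-dimension validation catches any label the model invents outside the codebook.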