Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_Ugx9VIO3G…`: "Hi Jeet, you got the right answer. Kudos. The contest is over and winners have b…"
- `ytc_Ugz-v6NEr…`: "it wouldnt have been so funny to the guys if they were making deep fake gay stuf…"
- `ytr_UgzUzSZ3C…`: "I always thought AGI would be air-gapped. Apparently we just keep it on the inte…"
- `ytc_UgyojNJPB…`: "The term "hallucination" is inappropriate for generative AI. Since AI is not con…"
- `ytc_Ugy2kD9F0…`: "I remember going to a trading card convention that has primarily Pokemon cards b…"
- `ytc_Ugxph4S24…`: "I have a feeling that this entire video is created by using AI scripts and AI-ge…"
- `ytc_UgyCHlK0F…`: "The main reason why people are angry with AI art is because AI art don't really …"
- `ytc_Ugwrs_CK_…`: "Mexico 🇲🇽 americans .Spain americans. Thailand 🇹🇭 americans .are you livin…"
Comment
> "Instead of getting distracted by future risks?" Are you kidding? You are trying to distract from the existential risks so you can get airtime.
>
> Of course current AI has problems. No one who is trying to raise awareness about the non-intuitive dangers of AI would disagree. But you are pretending that AI is the same kind of tool we have always made, when it is definitely not. In under a decade we went from 'it will be decades or centuries before machines will out think us' to 'in a few years machines are likely to out think us'. They are not slowing down, in fact, they are already using their superior intellect to design improvements in themselves we couldn't think of.
| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Responsibility | 2023-11-06T18:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxAL0ukzbZEdTZPQtx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzfXn21BiurL1tmsaN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwqts5VyG_N63DRGOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxATXJqvYGw0mllYz94AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzj1-gEV1upbIGfUiJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9_f0tMSf0H1-iO9t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzGZ-KfndIdSmbe9HR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxutE1MxXKGjVrKaaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwwIuvNe3ezMXQ8-jV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz0Bsx5UBPoY5n2SRV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
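The raw response is a JSON array of per-comment codes, one record per comment ID. A minimal sketch of how such a payload could be parsed, validated, and indexed to support the "look up by comment ID" view. The allowed code values below are inferred from the values visible on this page, not a confirmed codebook, and `index_codes` is a hypothetical helper name:

```python
import json

# Assumed codebook, inferred from values visible in the raw responses;
# the project's actual codebook may define more or different values.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "user",
                       "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference",
                "resignation", "unclear"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    dropping records with missing IDs or out-of-codebook values."""
    index = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            index[cid] = rec
    return index

# Example with the first record from the response above:
raw = ('[{"id":"ytc_UgxAL0ukzbZEdTZPQtx4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
codes = index_codes(raw)
print(codes["ytc_UgxAL0ukzbZEdTZPQtx4AaABAg"]["emotion"])  # outrage
```

Validating against a fixed code set before indexing catches the common failure mode where the model invents a label outside the codebook.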