Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
@ekgr212 Except, Musk is actually working on brain chips, and investing in AI as…
ytr_UgyxlWNtt…
She literally did not answer the first question. Unbelivable. What a hypocrite. …
ytc_UgwICRFcF…
No-one ever really speaks up about whether its in dispute that this is a glide p…
ytc_UgzTofsdy…
These interactions can seriously affect people whether it's about AI or not can …
ytr_Ugz0XdX1w…
Somethings interacting and manipulating our world and it's obvious. I bet Thiers…
ytc_Ugx5TomIZ…
I’ll say it again, but if so many jobs are lost through AI, who is left to buy s…
ytc_UgybiY0pL…
Real problems look like this:
take this APL program which is executed via a pro…
ytc_UgyVDpkD5…
@cgeditsco I meant AI
It's that feature from Google
An AI password manager means it, on its own…
ytr_UgyEcMF0E…
Comment
If you think AI is funny and all, and positive, and, as another guy who created it said "a saviour", then listen to 42:47 "So what remains?" "maybe for a while, some kinds of creativity. But the whole idea of super intelligence is - nothing remains. These things will get better than us at EVERYTHING." Then the question the host asks, is very legit: "what we end up doing in such a world". (...) He answers: "well, if they work for us, we get lots of goods and services for not much effort [giggles]". (...) and the bad scenario: "why would we need him?"
youtube
AI Governance
2025-08-06T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwETBlQLgxP-Io0X094AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzv1nrBsOg95iNKAbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx5uRRkxRrey2X9D5l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxIZucn4rxC7MMi0Dt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyv1DFh7UKy1g7taJZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzvUxKJINCrOCdS2s94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx2QvQslZZzpgr9NuV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxS6dCx4FOXmMZKNA94AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgymGti998p-X61eMDN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzWfv7AsvLTGj3ZQ0V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
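A raw batch response like the one above can be parsed and sanity-checked before the labels are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred from the samples shown here and may not match the project's full codebook, and the function name is an illustration, not part of any real pipeline.

```python
import json

# Allowed labels per coding dimension, inferred from the sample output
# above (hypothetical -- the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference", "unclear"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels are known."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# One valid row and one row with an out-of-codebook label.
raw = (
    '[{"id":"ytc_a","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"robots","reasoning":"unclear",'
    '"policy":"unclear","emotion":"mixed"}]'
)
kept = validate_coded_batch(raw)
print([row["id"] for row in kept])  # ['ytc_a']
```

Dropping (or flagging) rows with unknown labels at ingest time keeps a single malformed LLM answer from contaminating the coded dataset.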