Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Lol I have skill so don't use device that helps those with no skill this is the …
ytc_UgzBHfJiE…
AI art will never truly replace real art, it's too inconsistent and doesn't prov…
ytc_UgzrA3tR6…
Okay, hear me out…….Is AI extremely dangerous, yes of course. But based on what …
ytc_Ugzz2MYU3…
Scary. Ai with the best communication skills. Of course, it will be the leaders …
ytc_Ugy0gpK9l…
Blue collar jobs will eventually be taken by artificial intelligence because the…
ytc_Ugz9TItLQ…
It’s funny knowing that the US and other developed countries outsource their gre…
rdc_gx6d445
"AlphaFold is only good at one specific thing" (14:51) - this is the outrageousl…
ytc_UgwRcfrDA…
Haha, that's a good point! Without battery power, Sophia would definitely be qui…
ytr_UgyFRct9A…
Comment
Geoffrey Hinton is really good at seeing the wider negative implications of AI, but the problem I have with this interview is that he doesn’t seem to master the details.
However powerful his arguments are, as soon as you get to the details, you start having doubts about the premise for the argument in the first place. This is a shame, because the topic of the dangers of AI is incredibly important.
youtube
AI Governance
2025-06-21T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugye_v2z2tT8EpI5L-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrbzAiENZxFzU8mLd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx7NLvq3A721E9LXTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxO6Sbsj_gKGzxh-od4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyR14L7oo774awtR2l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNt92vBOZDGd7zJWp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzaiBGhOTvArgTp2Sh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwchOyvq_ZObg6nafR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyj7iXTvF3IXNPm1gh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzkGjerNIGBxA-M6P54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
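A raw response like the one above can be turned back into per-comment codes with a small parser. The sketch below is illustrative, not the tool's actual code: the dimension names come from the Coding Result table, the two sample rows are copied from the response above, and the skip-malformed-rows behavior is an assumption about how such output might reasonably be handled.

```python
import json

# Dimension names taken from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coded_response(raw: str) -> dict[str, dict[str, str]]:
    """Map each comment ID to its coded dimensions.

    Rows missing any dimension are skipped (an assumed policy,
    not necessarily what the real pipeline does).
    """
    coded = {}
    for row in json.loads(raw):
        if "id" not in row or not all(dim in row for dim in DIMENSIONS):
            continue  # drop malformed rows rather than fail the batch
        coded[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return coded

# Two rows copied verbatim from the raw response shown above.
raw = '''[
  {"id": "ytc_UgzaiBGhOTvArgTp2Sh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwchOyvq_ZObg6nafR4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]'''

codes = parse_coded_response(raw)
print(codes["ytc_UgwchOyvq_ZObg6nafR4AaABAg"]["responsibility"])  # developer
```

Keying the result by comment ID matches how this page looks comments up ("Look up by comment ID"), so a coded batch can be joined back to the original comments.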