Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I would love to have cars that apply the breaks or sound a warning in the car w… (ytc_Ugh4WDO7o…)
7:20 I can't believe this guy. He says it's not art if it's not original? Then w… (ytc_UgxlCINZv…)
Id rather imagine a world where ai replaces them and it will. This tech passes t… (ytc_UgwXiGa0I…)
@BobmiZeBobmi yeah I just searched it up and indeed a Tesla robot did attack a w… (ytr_UgzeT_lK-…)
I'm way more inclined to believe someone cloned a meatsack and trained it on the… (rdc_oh28njp)
It’s a bit funny to hear a Chinese person say that AI is great because they can’… (ytc_Ugx2W1afc…)
Yes Only the AI is an artist but what you artist are failing to notice is that t… (ytc_Ugy75B0p5…)
AI will take over the world's resources and jobs. Humans will be useless. What w… (ytc_UgzKQf_dv…)
Comment
I somehow listen to Geoffrey and think he encompasses humanity so beautifully. He is eloquent, compassionate, morally upstanding, well spoken, intelligent, analytical and seems fundamentally kind.
In short, Hinton’s concerns stem not from fear of technology itself, but from a sober recognition that with great power comes great responsibility - and the systems we’re building now might soon surpass our ability to steer them. His warnings are a call for humility, caution, and proactive governance in the face of transformative change.
youtube
AI Governance
2025-06-17T05:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxqNc2i5-uKafsJ9-N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtN-IjtBBpVv3Ugdl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwYbRWOFmbNh4QNTRB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTpecjewLSL1AAKGF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQJu5vk3tslmruuxd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzpi_qpkAmDy58YyUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCMSUvHIy_DloGWQ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyoeERaLSu-2gEwQjd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzAwVb-RmeMQLobX254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyWM39ZgCVQSIPG2Sh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
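The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step, using two rows copied from the response above (the indexing helper itself is an assumption, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (two rows
# copied verbatim from the response shown above).
raw_response = """
[
  {"id":"ytc_UgyWM39ZgCVQSIPG2Sh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwYbRWOFmbNh4QNTRB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
"""

# Index the codes by comment ID so a single comment's coding
# can be retrieved directly, as the inspection page does.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgyWM39ZgCVQSIPG2Sh4AaABAg"]
print(code["responsibility"], code["emotion"])  # developer approval
```

The same indexing works unchanged on the full ten-row response; any ID not present in the array simply raises a `KeyError`, which is a quick way to spot comments the model skipped.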