Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
First, I'm so sick of inflammatory, sensationalistic headlines and video titles. There's much more to this video/interview than the quote gratuitously taken out of context. Second, for a Nobel laureate, Hinton sometimes gives bad answers to questions with obvious "right" answers. For example, "knowing what you know now, do you regret the work you've done on machine learning? Would you do it again?" His answer, "I don't know." What? I don't know? WTF. Whaddaya mean you don't know? So you'd forego all of the completely life-changing good that is coming from and is going to come from advancement in AI just because there's risk to be managed? C'mon, man. what kind of answer is that? Advancement in AI is inevitable, inexorable. It is the ultimate destiny of humanity and this universe.
Source: youtube · Video: AI Jobs · Posted: 2025-11-05T09:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5xH8HMJuXuKnxndZ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx3_A4vq8vjZPMnmbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzYJDGT6O5DoD4HFHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxNbn6rqAa_NmWT0e94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxA3hhvGvqBmH5CInV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugybg8StZT6mrRNbtm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgweGMCr3gLRJOpoFnx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwVv1Yi2gvcb-p_8wp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxSGv9mvfcM1juKyRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwysnHrmF5v-C4Ww_p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
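The raw response is a JSON array of coding records, one per comment, keyed by comment ID. A minimal sketch of parsing such a batch and looking up a single record by ID (the variable names are illustrative, and only two entries from the array above are included for brevity):

```python
import json

# Raw LLM batch response: a JSON array of coding records,
# abridged from the response shown above.
raw = '''[
  {"id": "ytc_Ugx3_A4vq8vjZPMnmbp4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxA3hhvGvqBmH5CInV4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Index the records by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in records}

# Retrieve the coding result for one comment.
rec = by_id["ytc_Ugx3_A4vq8vjZPMnmbp4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # none outrage
```

This mirrors what the viewer does when rendering the "Coding Result" table for a selected comment: the stored raw response is parsed once and the record matching the comment's ID is formatted dimension by dimension.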