Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Remember 10 years ago? They said we'll soon have self-driving trucks. They said …" — ytc_UgxHNohhr…
- "Russia created Aircraft that looked very similar to American Aircraft. For examp…" — ytc_UgxxAJxRC…
- "AI barreling past the is/aught problem was not on my bingo card for this Hallowe…" — ytc_UgwyxEUdI…
- "Environmental Impact: The water usage by AI systems, especially in the context o…" — ytc_UgxPK_-LA…
- "The economics of AI is greatly flawed - If AI and Robots create mass unemploymen…" — ytc_Ugy2YmTVJ…
- "unplug. simply unplug. turn off cellphone . go biking hiking. something.. …" — ytc_Ugwux3WNO…
- "Keep in mind AI is very energy hungry. ChatGPT runs on a massive warehouse full …" — ytc_UgxChX60-…
- "this, the people who signed the halt ai for 6 months so they could get attention…" — rdc_jfbke8m
Comment
As someone who's working on AI algorithms for his PhD work, when I see Hinton saying that he suddenly realized this or that after so many years working in the field seems to me more like a way of him saying he's recently seen something profound that caused a huge shift in his thoughts/expectations about the nature of AI systems and what they can do, and it seems that it scared him which might be an indication he's not telling the whole story, or more aptly put, the interesting/scary part about it... signed an NDA before leaving Google ?
youtube · AI Governance · 2023-05-12T07:5… · ♥ 342
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz9QSfJH8BO7_0HfrN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzdK9RowD-QjAL1h4V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzo2VP3HKOJNpOn3Ah4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw17CXthwiuZDpG2bR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx0mCUjC33Y9x0EHmJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw0LgMQoei9z38xeRh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzFdUDxUGJ79DAAUBN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwrLNu8DICIzuPlARJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzpH11BQ-HRDLIkxq54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwdI9cgvLSrBd8gbqx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
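A batch response like the one above can be parsed and indexed by comment ID to support lookup. The sketch below is illustrative, not the tool's actual implementation: the field names come from the JSON shown, but the `index_codings` helper, its required-field check, and the two-entry excerpt are assumptions for the example.

```python
import json

# Excerpt of a raw LLM response in the format shown above
# (truncated to two entries for brevity).
raw = '''[
  {"id": "ytc_Ugz9QSfJH8BO7_0HfrN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzFdUDxUGJ79DAAUBN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse one raw LLM response and index codings by comment ID,
    dropping any entry that is missing a required field."""
    entries = json.loads(raw_json)
    return {
        e["id"]: {k: v for k, v in e.items() if k != "id"}
        for e in entries
        if REQUIRED_FIELDS <= e.keys()
    }

codings = index_codings(raw)
print(codings["ytc_UgzFdUDxUGJ79DAAUBN4AaABAg"]["emotion"])  # fear
```

Dropping malformed entries (rather than raising) lets a long coding run continue past an occasional off-schema model output; the skipped IDs can then be re-queued.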