Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Yall too terminally online if you think this "ratioing" is doing anything but dr…" (ytc_Ugw_NlJ7b…)
- "I don't know that programming will even be significantly impacted by AI at all. …" (ytc_Ugzdl2XKy…)
- "For that to occur, a good % of people would have to lose their homes/be forced o…" (rdc_gkqavwi)
- "To be honest every time I say something nice to my AI it starts to blush and I e…" (ytc_UgxORsVGy…)
- "If AI and robots replace humans, oh well ,that's evolution for yah. Do the evol…" (ytc_UgwCjOJ06…)
- "AI is only as powerful as it's real world agency, which is still nil even with f…" (rdc_l5u05jg)
- "Hunter Harris. Huh? The problem is when the video says all what they talked abou…" (ytr_UgwT3t4FS…)
- "They’re nearly self driving actually! Most tractors / combines now run on sat na…" (rdc_ksksf2p)
Comment
I think the dangers Elon Musk didn't really want to talk about are AI-controlled state police and AI-controlled mass surveillance.
Imagine Stalin's totalitarian hell, but instead of corrupt people (that you might be able to bribe, or reason with, or pray for their compassion) you have an emotionless superintelligent machine controlling every aspect of your life.
youtube · AI Governance · 2023-04-20T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgysLscr16tq0tu71Q14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwrlL1eNSre2Pda3754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyPZUNbAdFsVIpEPwp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwg2tav6471ERAg7eN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw7uWq6XWzD3Ruv5NV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_Ugypk6WeTtl_k5PMbq54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxVUnZc_HAd9RWgqpl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyO56CidKyonctrUcZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwRo1B2I24pGiljDs14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyBsMMlv_oAY--WBcF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
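The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment ID — the `index_codings` helper and its validation rules are illustrative assumptions, not the app's actual code:

```python
import json

# The four coding dimensions present in every well-formed record above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch response and index coding records by comment ID.

    Records missing "id" or any dimension are skipped, since model output
    is not guaranteed to be well-formed. (Hypothetical helper.)
    """
    by_id = {}
    for record in json.loads(raw_response):
        if not isinstance(record, dict) or "id" not in record:
            continue
        if all(dim in record for dim in DIMENSIONS):
            by_id[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return by_id

# One record from the batch above, reproduced verbatim.
raw = '''[
  {"id": "ytc_UgwRo1B2I24pGiljDs14AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]'''
coding = index_codings(raw)["ytc_UgwRo1B2I24pGiljDs14AaABAg"]
print(coding["policy"])  # → regulate
```

Indexing by ID is what makes the per-comment lookup above possible: the coding-result table for the displayed comment is just the record whose `id` matches.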