Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
DONT WORRY YALL there humans with special effects to look like a robot but ai ha…
ytc_UgymddWY1…
my opinion is that art was always meant to be purely subjective and shared betwe…
ytc_Ugw3MFVzr…
Art is a lot like musical talent in that people assume “oh you’re just so talent…
ytc_UgxSUP9f3…
Wild to me 4,100 people disliked this, and prolly because lavender is poisoning …
ytc_UgxWSMvCh…
There's a saying I have to these corporations. Who's gonna buy your sh*t when ev…
ytc_UgzgQVxHB…
I would reckon AI will replace some jobs, not all. It all has to do with reliabi…
ytc_UgzdwTFU6…
"Blah blah blah - we don't care what the general population of this planet think…
ytc_UgxurhrF5…
Tech bro here. While I do think AI does have some uses, for the most part, AI is…
ytc_UgytuYXhz…
Comment
It baffles me that Dr. Tyson is so nonchalant about the risks of runaway AI. Nuclear physics also offered huge promise for energy production, but missing one detail on cooling rod design nearly wiped humanity off the planet with Chernobyl. And we had a far better grasp on nuclear physics then than we do on how AIs work now. If anybody would have that kind of example in their back pocket I would have expected it to be Tyson.
There's a book called "If Anyone Builds It, Everyone Dies" written by 2 of the longest-standing researchers in AI-alignment. I highly recommend reading it to understand this isn't just alarmism.
youtube
AI Moral Status
2025-10-09T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyDtupO9bmltIr2M7N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwR8n1RS7C1QEWDnYx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyNiygHMsonXzIEeuZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxRxdFpGrBn0NrCX6J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwnlU8EL3XRvzkVP7N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxyF_zMoS82yaMyy694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx7IWMhFVEWmx_oxDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzYf5000temkNKkiWB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxz0Uj7CK4Vtqf3rih4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHeNwep2Zfve0OQ1V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
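The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of parsing such a batch and indexing it by comment ID, so a single coding can be looked up the way this page does (the field names come from the response itself; the `index_codings` helper and the two abbreviated sample records are illustrative, not part of the actual pipeline):

```python
import json

# Two records in the shape of the raw response above (IDs taken from it).
raw_response = """[
  {"id": "ytc_UgyNiygHMsonXzIEeuZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxRxdFpGrBn0NrCX6J4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batched coding response and key each record by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Skip records missing a dimension rather than failing the whole batch.
        if "id" not in rec or not all(dim in rec for dim in DIMENSIONS):
            continue
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

codings = index_codings(raw_response)
print(codings["ytc_UgyNiygHMsonXzIEeuZ4AaABAg"]["policy"])  # regulate
```

Keying by ID rather than list position makes the lookup robust if the model returns records out of order or drops a comment from the batch.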