Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below to inspect.
I found that if you have ChatGPT open on your phone and you do the same translat…
ytc_Ugwot4ebf…
You better stay in your basement when they release the first robot its gonna com…
ytc_Ugw1h5_12…
3:15 "we don't know what the ideal worker skills will be." Managers will ask AIs…
ytc_UgzW-iJ5N…
Wow, incredible work — next, maybe try convincing Siri to confess her sins. This…
ytc_UgxZVs8Ww…
Get rid of these archaic leaders and persons in power
She should know it’s ai f…
ytc_UgxZb4sgN…
If someone is thrilled to have AI do their artist job for them like writing joke…
ytc_Ugys0ie_-…
Maybe even politics. When will AI run for president? It may have a real chance o…
ytr_Ugwj6MGkc…
Ai is built around pre defined conversions to provide context in answer only bas…
ytc_Ugw7caiC4…
Comment
Neil’s Optimism Feels a Bit Outdated
Neil deGrasse Tyson is brilliant and always fun to listen to—but his take that “AI is just another tool like the car or computer” feels dangerously optimistic.
Yes, past tech revolutions replaced jobs and created new ones. But AI isn’t just automation—it’s cognition at scale. And while technology is evolving exponentially, humans aren’t. Most of us aren’t becoming exponentially more creative, imaginative, or adaptive.
Telling every ordinary person to “just be more creative” to survive this shift sounds like asking everyone to be Olympic athletes just to keep their job.
The danger isn’t that AI will replace everyone—it’s that it will replace enough to reshape the fabric of society, while we keep telling ourselves it’s all going to be fine.
youtube · AI Moral Status · 2025-08-02T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzx3cehIRJTdobB30V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvQqgYpSTcjUxNwdB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzr_4Mx4l_orioo4Sx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxuu1ybIY83-94zH5h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwrs60cRfTBY18oqqd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz414nl5nZ1-nel3S94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzKY4a2mku8wUtcg9F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxj9fL6RR8qSAjNOHF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKSbolSkvwsFbIxh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGjooVHvMiVBhXAMB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
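A batch response like the one above can be validated before its rows are stored as coding results. The sketch below is a minimal, hypothetical check: the allowed vocabularies are inferred from the codes visible in this sample (the real codebook may include more categories), and the `parse_llm_response` helper name is illustrative.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above. Hypothetical: the actual codebook may define additional codes.
ALLOWED = {
    "responsibility": {"none", "company", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in the samples start with "ytc_" (or "ytr_").
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must carry a known code.
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows that fail validation can then be queued for re-coding rather than silently written to the results table.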