Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
these are guaranteed human, artificial intelligence do not even move their head …
ytc_UgyQ0qIt1…
There are many cool and beneficial things thst can be done with generative ai, …
ytc_UgztU9X0r…
This has been a thing for like two years, every damn company has created their l…
ytc_UgwDUYtwq…
Theres one difference between those other nations and the USA as an example.....…
ytc_UgxSznR3e…
The issue is, we don’t know what consciousness is. Maybe we’re just infinitely c…
ytc_UgzzkNF9C…
Yep. OpenAI/Microsoft put out ChatGPT and immediately started talking about how …
rdc_k0b6eqw
Top school in the world were interviewed noted that increased break/recess time …
ytc_Ugx4itPbd…
Thank you for sharing your thoughts. In the video, Sophia emphasizes the importa…
ytr_UgzP1SqWs…
Comment
The cool thing about the Fedex arrow? They got it to work in quite a few other alphabets, and in right-to-left writing systems the arrow points the other way.
The problem with Tyson's discussion of AI/AGI is that the subject is really not in his area of expertise. Astrophysicists are not huge users of machine learning outside of data analysis (though ML for data analysis has been an incredible tool); his explanation of AGI is nearly as bad as your explanation of "gravity and orbits". Which is to say, in the same town as the ballpark, but so imprecise it is utterly useless.
This is Tyson confidently asserting that planes can land with their engines disabled, but helicopters cannot. Why are physicists like this?
youtube
AI Moral Status
2025-07-23T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugy7iTpAjBgvNFodIrl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3Mr6w--TYoQMzd7l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy5uL99BUDWUZSpt814AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVrHxzHO8zqTxr2K14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjXbz2wScnTE4A1jp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYndSuBgfmlaemQAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx05Dskt1mjeZW_nup4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy0Phabqni1bd0Dd2Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzb7WsW2pllwxy0IXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9fg_nQKfLKYV_-DV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
```
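The raw response is expected to be a JSON array of per-comment codes, but the closing delimiter can arrive malformed, as above, where a stray `)` appears where the array's `]` belongs. A minimal sketch of a defensive parser for such output, assuming Python; the function name `parse_coding_response` and the comment ID `ytc_EXAMPLE1` are hypothetical, not part of this tool:

```python
import json

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response into per-comment code dicts.

    Models occasionally emit a malformed closing delimiter (e.g. a
    stray ')' where the JSON array's ']' belongs), so trim or repair
    the tail before handing the string to json.loads.
    """
    raw = raw.strip()
    end = raw.rfind("]")
    if end != -1:
        raw = raw[: end + 1]   # drop any trailing junk after the last ']'
    elif raw.endswith(")"):
        raw = raw[:-1] + "]"   # repair a ')' used in place of ']'
    return json.loads(raw)

# Demo with a made-up comment ID and the same malformed tail seen above.
raw = ('[{"id":"ytc_EXAMPLE1","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"mixed"})')
codes = parse_coding_response(raw)
```

Repairing the tail rather than failing outright keeps a whole batch of codes usable when only the final delimiter is wrong; anything that still fails `json.loads` should surface as an error rather than be guessed at.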