Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Neil, I agree — science gave us some of the greatest breakthroughs in history. But after we solved the basics — clean water, antibiotics, emergency medicine — the focus shifted. Technology, big pharma, and the scientific machine started prioritizing profit over purpose. If we’d frozen technology and medicine at the 1980s level and focused purely on making those tools cleaner, safer, and more equitable, we’d likely have lower chronic illness, higher mental resilience, better community ties, fewer addictions, and a slower, more grounded pace of life. The potential of science is still incredible… but the direction it’s been steered in has too often moved us away from a truly healthier, happier human experience.
Source: youtube · Topic: AI Moral Status · Posted: 2025-08-11T23:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwPz9ygWheepYrI5894AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxuxgtj9e9x7vcrBi94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxejfLC2zM6RmEhgG54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJJvnngYkWlGzFLUd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxgBUrTpJ764j8G8CV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy8QcdoXyKxj20YAFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgztAxPqB9JLunLtlm94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoeGnkvMaNMA5Xhg14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgztlpjZFpEjSgwLnF54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcSMkd4Ww9c1ZqFT94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}]
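The raw response is a JSON array of per-comment codings, one object per comment ID with the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed into a lookup keyed by comment ID; the `lookup` helper and its all-"unclear" fallback are illustrative assumptions, not part of the actual pipeline, and the two rows are copied from the response above:

```python
import json

# Raw LLM response excerpt: a JSON array of coding objects (two rows shown).
raw = '''[
  {"id": "ytc_UgwPz9ygWheepYrI5894AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxuxgtj9e9x7vcrBi94AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]'''

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment.

    Hypothetical helper: falls back to all-'unclear' when the model's
    response omitted the comment, mirroring the table rendering above.
    """
    default = {"id": comment_id, **{d: "unclear" for d in DIMENSIONS}}
    return codings.get(comment_id, default)

print(lookup("ytc_Ugxuxgtj9e9x7vcrBi94AaABAg")["reasoning"])  # unclear
```

An ID missing from the response resolves to "unclear" on every dimension, which matches the all-"unclear" coding result displayed for the sampled comment.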