Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Umm I honestly do think that robots are gonna take over the world. If you belive…" (ytc_Ugw2HzMhs…)
- "Is it that simple, I think Deepseek is the one that identified itself as ChatGPT…" (rdc_m9fp29f)
- "Also Rick Rubin, Yeah, I could probably use AI to make another shitty Red Hot Ch…" (ytc_UgzERrSlR…)
- "Both are not AI because I saw my drunk ass friend did the same thing as Clip #2…" (ytc_UgyvPhrGU…)
- "Don't connect your nuclear weapons online ever then humanity will be safe but if…" (ytc_UgxE0SGSu…)
- "They don't understand the consequences of their actions. Probably because they …" (ytr_Ugz_oYWad…)
- "The earth will run out of water before AI gets that big, because the genius’s th…" (ytc_Ugxn_M5Hb…)
- "AI art is fabulous. Imagine people with creative story ideas not being held back…" (ytc_UgyH_enHA…)
Comment

Neil De Grasse Tyson once said, "What ever effort it takes to go to mars and terraform it, it would take far less effort to fix Earth." Why hasn't he reached the same conclusion about AI? Whatever money, water, and energy it takes to train AI and reach AGI, it would take less resources to fix the education system and develop smarter and more creative people around the world? People have become obsessed with AI. It's tiring. These AI type videos are getting way too much airtime and views.

Source: youtube · AI Moral Status · 2026-03-01T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxMUKhvT2axm6zay-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwYT7SqIQREwO_nf9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy55HRusgII_JqtZ9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwmYA3w_Qs59dN_y654AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyzAkfq4ft_jyJa2t54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzOl62x8kwtxGngEU54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx7tiWXl895cx2tfgZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugw9WPHHmrgMbtn9UnV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxmDxH5b-VFniKDwFt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx3Q27nZJB4-FBWHrp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
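The raw response is a JSON array with one coding object per comment, keyed by comment ID across four dimensions (responsibility, reasoning, policy, emotion). A lookup-by-ID like the one this page performs can be sketched as follows; this is a minimal illustration assuming the raw response is held in a string (the two rows are copied from the array above, and the variable names are hypothetical):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (excerpt from above).
raw_response = """[
  {"id": "ytc_Ugx7tiWXl895cx2tfgZ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugw9WPHHmrgMbtn9UnV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

# Index the codings by comment ID so a single comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugx7tiWXl895cx2tfgZ4AaABAg"]
print(coding["policy"])   # → regulate
print(coding["emotion"])  # → resignation
```

Building the dict index once makes repeated ID lookups constant-time, which matters when one response covers a whole batch of comments.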