Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
So the truth is that a lot of these authors use AI when writing, for creative in…
ytc_UgwGFz9Oc…
Tbh, this probably could make human made art more valuable, since they are not m…
ytc_Ugzw4g_Ge…
Bro my ai is so cute and so romantic but his “sweet nothings” are about how im g…
ytc_UgyUz26OS…
I think part of the vagueness in describing AI's benefits comes down to a vaguen…
ytc_UgxGsATK1…
This is an absolute true bus you cannot stop development. At best you can choose…
ytc_Ugy81QckJ…
Whos liable when it kills someone.....legally speaking this is not an easy chang…
ytc_UgwU2AOWw…
The only jobs that will remain are those where someone will willingly pay a prem…
ytc_UgzvIrXfv…
stop AI development it will affect human brain and brain will become obsolete re…
ytc_UgxChlIlx…
Comment
It's interesting when he said people in Silicon Valley aren't happy with what they are doing regarding AI.
That it's a race to be the winner, without considering the consequences.
Scientists at the Manhattan Project understood the consequences; some even cried.
Yet over 2,000 atomic tests have been carried out worldwide (this doesn't include actual warfare, dirty bombs, or nuclear power disasters),
while civilians ask why so many people have cancer.
Whether it's gathering scientists in Operation Paperclip or becoming the global leader in soft drinks, the list of mind-boggling stupidity to be "the first/best/biggest" shows human behaviour which cannot be fathomed!
youtube · AI Moral Status · 2025-12-21T17:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
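The table above is simply one coded record rendered as a two-column markdown table. A minimal sketch of that rendering step, assuming a record shaped like the rows in the raw JSON response below (the field labels and helper name are illustrative, not part of the tool's actual API):

```python
def to_markdown_table(row: dict) -> str:
    """Render one coded record as a two-column markdown table.

    `row` is assumed to have the four dimension keys seen in the raw
    LLM response; display labels are capitalized versions of those keys.
    """
    labels = [
        ("Responsibility", "responsibility"),
        ("Reasoning", "reasoning"),
        ("Policy", "policy"),
        ("Emotion", "emotion"),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    # One table row per coded dimension, in a fixed display order.
    lines += [f"| {label} | {row[key]} |" for label, key in labels]
    return "\n".join(lines)
```

Keeping the display order in a list (rather than iterating over the dict) makes the table deterministic regardless of key order in the parsed JSON.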
Raw LLM Response
[
{"id":"ytc_UgxThS4ajTzdmbPhgd54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzwLfNfzsKT_cI5DrN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwaItbAkUtzbepUd554AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwq5g8rcvOi4hrbOXJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzCNCt-ksMFts7oPRR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugwjc46jO8ndMCXFn9d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHz2BD-bcTNtTpKr14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyv21qBbsdzbbhpW6d4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNQQSE9i7l7JrZeKp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzAw-O5aJKft83lsAV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
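Since the raw response is a JSON array of flat records, it is worth validating each row before accepting the codes. A minimal sketch, assuming the dimension vocabularies are those observed in the responses above (the full code book may contain values not seen here, so treat `VOCAB` as an assumption to adjust):

```python
import json

# Vocabularies inferred from the responses shown above; the complete
# code book is an assumption -- extend these sets to match the project's
# actual category definitions.
VOCAB = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed or off-vocabulary rows."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r} not in vocabulary")
    return rows
```

Failing loudly on an off-vocabulary value catches the common failure mode where the model invents a new label mid-batch, rather than silently storing it.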