Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
really? AI "whitewashing"? do you even know what colorwashing means? is taking a…
ytc_Ugy1fzu2U…
Although his concerns are valid and what he said is correct to some extent, I th…
ytc_UgzpP8UJ2…
As an graphic designer (literally a HND college student), I'm all for AI. Is it …
ytc_UgzwKf_Qa…
time to regulate , AI tax , for every job you replace with AI you have to pay 50…
ytc_Ugz1hs50G…
Fr, when im talking to A.I im now saying "pls, and thank you" just in case they'…
ytr_Ugyr36-bm…
The godfather of AI has great concerns. https://youtu.be/giT0ytynSqg He suggest…
ytc_Ugz_ts6cG…
Autonomous driving is inevitable...it improves everyday. But, will people still…
ytc_UgyE61DIx…
That’s because you don’t have full self driving. That’s regular autopilot. It wo…
ytc_UgxmCQd_8…
Comment
I can see super memory being built, but not super intelligence. Most of these so called artificial intelligence things have been terrible at coming up with anything new and useful. We aren't going to make a Newton or an Einstein, or a Mozart or a Beethoven, or a Shakespeare or a Mark Twain, or on and on.
youtube
AI Moral Status
2026-02-16T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDNvBt1RU1jLzrODd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1of0XxWW4F7u2CCF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwYFUHR-qCbaVRtRsJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZLx4xtalhf6Frrad4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgylJK7D6NYyjm6_Zj54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyNG5bdZbi3Q_NFcrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy0wEe9faydo4-wh6R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwNzkks9jFu5Hka_1x4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxP6zaYhHh9fPK_hVd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyEk3S4weaUjWW1zMB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
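A minimal sketch of how a raw response in this shape can be parsed and indexed for lookup by comment ID, as the dashboard does. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON above; the variable names and the two-row sample are illustrative assumptions.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (two rows excerpted from the sample response above).
raw = '''[
{"id":"ytc_UgwDNvBt1RU1jLzrODd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZLx4xtalhf6Frrad4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]'''

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one coded comment by its ID.
row = codings["ytc_UgyZLx4xtalhf6Frrad4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer resignation
```

Keying on `id` also makes it easy to join the codings back onto the original comment text or to spot IDs the model dropped or duplicated.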