Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytc_UgypgtsAB…`: The fear of world destruction during the Manhattan Project, stemming from a chai…
- `rdc_mz0crfe`: Meta and Xai will also fail to attract users because of lack of trust of the own…
- `ytr_UgxTsR3Eg…`: literally this! everyone pushed college when i was in highschool (graduated 2006…
- `ytc_Ugzdyq7FC…`: The fact is that autonomous cars are still really far away (despite what Elon sa…
- `ytc_Ugx65Qbv0…`: Thank you🙏 1:50:37 Again I can say these podcasts about AI are fantastic awarene…
- `ytc_Ugz4T8uwA…`: this is like blaming the gun...its not AI's fault, it the parents for not being …
- `ytc_UgweQ6z3j…`: "They are scared because they know they are not capable of doing something like …
- `ytc_Ugz3cxqRH…`: People romanticize struggle too much. There have always been people who reject n…
Comment
LLMs don't "believe", they aren't "obsessed", they don't "try to convince you". They say what the most probable answer is according to their training data. It doesn't have thoughts. It doesn't think. It's not evil. It's a talking machine with no humanity. That's why we shouldn't trust it to do tasks that require humanity.
youtube · AI Moral Status · 2026-01-30T12:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
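A coded row like the one above can be checked programmatically. Below is a minimal sketch; the allowed value sets are only those observed in this page's output, so the actual codebook may define additional values, and the `ALLOWED` / `validate` names are illustrative, not from the pipeline.

```python
# Dimension value sets as observed in this sample's output (assumption:
# the real codebook may allow more values than appear here).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems with one coded row; empty means it passed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding result shown in the table above:
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(validate(row))  # prints "[]"
```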
Raw LLM Response
```json
[
  {"id":"ytc_UgxkE4CDbpdflsqrv454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOQgVWGOJsFXpGcgh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzF2t3a69pSt0yqt_t4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMyiuRE_Up8yIvAY54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyj-aMAs0JL-h3ZFO94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzhYFADDOWrSx5IRmR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx0uCPCITm6Vg-Nxdt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZwYTQSOvmX2Sqfep4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwu1hIYllPkyTgAdY14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyAAA-QqSUW1O_8Gs54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
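The "look up by comment ID" step amounts to parsing this JSON array and indexing it by `id`. A minimal sketch, using two rows taken from the response above; the `raw_response`, `codings`, and `lookup` names are illustrative, not the tool's actual API.

```python
import json

# Two rows copied from the raw LLM response above, standing in for the
# full JSON array the model returns.
raw_response = """
[
  {"id": "ytc_UgxkE4CDbpdflsqrv454AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwu1hIYllPkyTgAdY14AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
"""

# Index every coded row by its comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codings[comment_id]

print(lookup("ytc_UgxkE4CDbpdflsqrv454AaABAg")["reasoning"])
# prints "consequentialist"
```

Because IDs are unique per comment, a plain dict is enough here; a real inspector over many batches would merge the arrays from each raw response before indexing.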