Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm actually not so good at drawing, but I still learn how to draw humans, build…" (ytc_Ugz764DHY…)
- "If you're trying to learn AI or onboard it into your company, you're throwing yo…" (ytc_UgwtRNn2a…)
- "@Mmmmkaaay producing cars is the most advanced and standardized industry in the …" (ytr_Ugw2U_RQt…)
- "People who don't understand the difference between the hand of the artist and a …" (ytc_UgwPoTcvN…)
- "Yes, that's exactly what came to mind seeing this. Not a doubt, dating apps will now use AI to p…" (ytr_UgyrX3CfZ…)
- "LLMs are garbage, they're not intelligent, a useful tool at most. The bubble is …" (ytc_Ugz6gmioF…)
- "AI will not end humans because it would need to feel we are superior or competit…" (ytc_UgxkeyJO2…)
- "22:17 think about the people that used to make clothes back in the day and they …" (ytc_UgzJq44kM…)
Comment
This guy's thinking is deeply flawed in at least four ways (beyond already pointed out by Sam Harris). (1) There are powerful incentives to view the AI's as people, including economic ones and personal ones (maybe this AI is your lost parent, or is a real friend, or people want a doctor / teacher / therapist who is a person). (2) People can believe AI's are not people for reasons beyond this guy's imagined need for people to want a slave/master relationship / economic benefit; maybe people have different belief systems about consciousness than this guy, like intelligence and consciousness being different properties. (3) This guy pretends like consciousness is a solved problem (actually Sam Harris hints at how ridiculous this guy is when he mentions that the Turing test was not a thing). (4) This guy does not realize that if the robots actually are not conscious but we treat them so, then we will be replacing conscious life with a husk.
youtube · AI Moral Status · 2026-04-06T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
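The Coding Result table above is one coded record rendered as dimension/value rows. A minimal sketch of that rendering step, assuming a record dict with these five keys plus a `coded_at` timestamp (the key names and the sample record here are illustrative assumptions, not the actual data model):

```python
def render_coding_table(record: dict) -> str:
    """Render one coded record as a markdown Dimension/Value table."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", record["coded_at"]),  # 'coded_at' key is an assumed name
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

# Hypothetical record matching the table shown above
record = {
    "responsibility": "distributed",
    "reasoning": "mixed",
    "policy": "none",
    "emotion": "mixed",
    "coded_at": "2026-04-27T06:24:59.937377",
}
print(render_coding_table(record))
```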
Raw LLM Response
```json
[
  {"id": "ytc_Ugxd-16xAhgIrJXWzLJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxgCWSW-eClYf6NIPZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy5mppCvlbj-HDaEBh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz6CT7hF5hLGFa-0mN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzjBEyNbxEmyuNRhgR4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwz3JiMAKtryQWqEiR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxuFhVcfbKYwQqv5E14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxAZ_LxCu3tiF12E9l4AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwZXjdSS7kd3w6LMSJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyEhROkEKS8Esph1v14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
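The "look up by comment ID" step over such a response can be sketched as follows, assuming the raw LLM response is always a well-formed JSON array of records with these five keys (the records in this example are hypothetical stand-ins, not the actual coded data):

```python
import json

# Hypothetical raw response in the same shape as the one above
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "distributed", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_example2", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records) and
    return a dict keyed by comment ID for constant-time lookup."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_comment_id(raw_response)
print(coded["ytc_example2"]["policy"])  # -> regulate
```

Indexing once and looking up by ID avoids rescanning the array for every inspected comment; a real pipeline would also want to handle malformed JSON from the model, which this sketch does not.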