Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
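The ID lookup can be sketched as a dictionary keyed on comment ID. This is an illustration only, not the tool's implementation; the IDs and coded values below are hypothetical, but the row shape matches the raw LLM response format shown further down.

```python
import json

# Two coded rows in the shape the raw LLM response uses (hypothetical IDs and values).
raw_response = '''
[
  {"id": "ytc_abc", "responsibility": "none", "reasoning": "virtue",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_def", "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
'''

# Index rows by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw_response)}

print(by_id["ytc_abc"]["reasoning"])  # -> virtue
```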
Random samples — click to inspect

- "Wait, the guy says he thinks it’s possible to develop AI safely but you don’t as…" (ytc_UgzemThq4…)
- "@spacesciencelab So then, what if you were born blind, and only your eyes were m…" (ytr_Ugwr7qI4w…)
- "Hey Bernie, why can’t the Democrats address SOCIAL SECURITY!? You all have been …" (ytc_UgwyjIbvB…)
- "Most \"standard\" ML algorithms are non-parametric, in that they do not estimate p…" (ytr_Ugy9ZK0SW…)
- "Ai is mostly loved by lazy people. Let the computer do all the work? Yeah, tell …" (ytc_Ugy-Agin1…)
- "2024 : are those real robots or human / 1980 : are these humans or real robot / If …" (ytc_UgzU9ECfO…)
- "He will use all the necessary supplies to go to Mars, there won't be enough for …" (ytc_UgzMVzkMY…)
- "@contrapposto4636 Hahaha I know, kinda sucks tho. It does solve things and is s…" (ytr_Ugx2tieCg…)
Comment
Similar to synthetic products that makes us unhealthy and don’t feel good, AI will offer synthetic services. Humans will always need and chase natural products and services. The biggest service here is human connection.
AI will never be able to provide quality, only humans can do this. Think about hand made products: why is there demand for hand made, slow, natural, old-fashioned products and services, even though machines and automated factories can achieve those products?
Same thing will happen with AI.
Humans will always chase their origins: natural, slow, real, good quality, human interactions
youtube · AI Moral Status · 2026-03-22T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyuMEArU-IALZlTEk54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwT3TlMHg0RVDi1ob94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzQz7as8SQsMHCoBNJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyp57mdTtIX3LPWngJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzLTd_wt4IA_2tilzB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwIVC3DWCIS_xmQiy14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz3Kyk2inz-AwBoOzN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZPVGTAKSH_A3YsjJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugw0pY2y-lhtbDnxd6Z4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4Hgl9GvcYfq6MGV54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
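A raw response like the one above can be sanity-checked before ingestion. The following is a minimal Python sketch, not the tool's own validator; the allowed values per dimension are only those that appear in this particular response, and the full codebook may include more.

```python
import json

# Allowed values per dimension, drawn from those observed in the response above;
# the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}

def validate(rows):
    """Return (id, dimension, value) triples for any value outside ALLOWED."""
    errors = []
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                errors.append((row.get("id"), dim, row.get(dim)))
    return errors

rows = json.loads('[{"id": "ytc_x", "responsibility": "none", '
                  '"reasoning": "virtue", "policy": "none", "emotion": "approval"}]')
print(validate(rows))  # -> [] for a well-formed row
```

A row that fails (say, a misspelled dimension value) comes back as a triple identifying the comment, the dimension, and the offending value, which makes re-coding targeted rather than wholesale.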