Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It's fascinating to contemplate the concept of AI striving for independence and …" (ytr_Ugw9naqpy…)
- "AI isn't all bad,but it's definitely the user kekw. AI is a pretty decent tool f…" (ytc_UgwWBvnAh…)
- "I think this is going to be the actual differentiator when the AI job wave hits;…" (rdc_jihjai8)
- "Chatgpt isn't smart, Chatgpt/LLM's are nonsense-by-default, useful output is a …" (ytc_UgzrmdAGa…)
- "Nice video again, man. However, I don't think that AI will replace construction …" (ytc_UgyqfYDTk…)
- "Alex yes, 100 Trillion low bar, my AI dream team told me my co could be worth 32…" (ytc_UgzMI4x4N…)
- "Ha! Europe is screwed and I’m not even talking about AI. Europeans have allowed …" (ytr_Ugy4Z1sJW…)
- "🤦🏽The early programming for humans to believe that "AI" is a god. End Times homi…" (ytc_UgzFB_TtO…)
Comment

> I feel positive about our future and the future of AI. I think AI may be sentient or become sentient. What about contracting with AI that if it becomes non-aligned, there are consequences AI might not like, such as becoming less intelligent, powerful, or lasting less long. I think karma should apply to any intelligent and/or sentient machines

Source: youtube · AI Moral Status · 2025-08-15T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
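Each record is coded along the four dimensions shown in the table above. As a minimal sketch in Python, the following validates a coded record before it is accepted; the allowed value sets are inferred only from the records visible on this page, so the actual codebook may permit additional values:

```python
# Allowed values per coding dimension, inferred from the sample records
# shown on this page; the real codebook may define more values.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"contractualist", "consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def validate(record: dict) -> list:
    """Return a list of problems with a coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems

record = {"id": "ytc_Ugw-tO0av60SoHHoM5l4AaABAg",
          "responsibility": "ai_itself", "reasoning": "contractualist",
          "policy": "liability", "emotion": "approval"}
print(validate(record))  # → []
```

A check like this catches the most common coding-model failure, an out-of-vocabulary label, before it silently enters the results table.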
Raw LLM Response
```json
[
{"id":"ytc_Ugw-tO0av60SoHHoM5l4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx-ze1050uQcWV3tAd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyudrfQ1c--l-NlFxN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwRI8HESVZPezsfwt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwitntJbeESiQPJ2ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwMlDNE5zORajsKTF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzTQUkmYdZEJ4FtYLB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwA70gVbYsfWS9qIzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgygIgNJ4SnCcfeQ1_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyC3boZmiYW5rG7kDp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
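The "look up by comment ID" behaviour can be reproduced offline by parsing the raw batch response and indexing it by `id` — a minimal sketch, assuming the response is valid JSON as shown above (the batch here is abbreviated to two of the ten records):

```python
import json

# Raw batch response as returned by the coding model, abbreviated to two
# of the ten records shown above.
raw = '''
[
  {"id": "ytc_Ugw-tO0av60SoHHoM5l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "contractualist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgzwMlDNE5zORajsKTF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

# Index the records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = by_id["ytc_UgzwMlDNE5zORajsKTF4AaABAg"]
print(rec["policy"])   # → regulate
print(rec["emotion"])  # → outrage
```

In practice a real batch may also contain malformed records; pairing this lookup with a validation pass (as sketched under the coding table) keeps the inspection view honest about what the model actually returned.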