Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> From a logical stand point if robots/A.I understood that they benefit from us (meaning life and support meaning we can repair them if needed) and we benefit of them (like calculations, education etc) then we'll be able to make a truce and a warning, one side goes rogue then the other side can stop it, A.I goes rogue we can shut it down its about forming a line between war and peace that we can make a deal between both sides, I could be wrong but who knows

youtube · AI Moral Status · 2018-09-28T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgztiI5yrRajqwF9Z9V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzj5RgNyYKfMsAWaVl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFBAYDXj1Eupo7GBt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugww-LRZhApWZHzGTZJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzaPRapbiZKyBSwVX54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzHNSMTIK_28zvvBth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0n1V_4zZk3YVZXdd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwHQwU08jJ2-N9Y-GN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5PkNVdtMueJYCzpF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyThO8bFVYwQ0oTQRB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
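The coding result for a single comment is recovered by parsing the raw LLM response (a JSON array of per-comment records) and indexing the entries by comment ID. A minimal sketch of that lookup, assuming the array shape shown above (the `index_codings` helper name is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above.
# Only one entry is reproduced here for brevity; the real response holds a batch.
raw_response = """[
  {"id": "ytc_Ugx0n1V_4zZk3YVZXdd4AaABAg",
   "responsibility": "distributed",
   "reasoning": "contractualist",
   "policy": "industry_self",
   "emotion": "approval"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the coding records by comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugx0n1V_4zZk3YVZXdd4AaABAg"]
print(coding["responsibility"], coding["policy"])  # → distributed industry_self
```

Indexing by `id` makes the "look up by comment ID" step a constant-time dictionary access, which matches the dimension/value table rendered for each inspected comment.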