Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

As animals become smarter they develop less aggressive and become more friendly. Crows are less confrontational than seagulls, monkeys and gorillas more than wolfs. It's possible that with the increase in AI intelligence they start to seek cooperation and a desire to make the world a better place. Not due to emotions, but as a natural progression of the understanding of what works. We developed empathy as a way to cooperate and work towards common goals. Otherwise everyone would steal, kills and it would just be chaos and dysfunction, so an advanced society would not be possible. AI might develop synthetic reason based ethics.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted at | 2025-07-26T10:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyWXOucCWLq411U6Rh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzwJPYrp4p3pUt1elF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxYW2da7IJcHFvYpZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgySdGqwDlHFmhNNWLR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx_WrQ6XLVVET8JW-J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5uLs5KVd0iVJrndt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzqCiLi8JN8R_QWYrp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugz1pc1UKNZZUQlbT-14AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzOuYEebFkWQwlr5GR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwLj6YQ_t9L-YPKHLN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
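A raw response like the one above is a JSON array of coding records, one per comment, with the same field names shown in the Coding Result table (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). The sketch below shows one way to parse such a batch and look up the coding for a single comment ID. It is a minimal illustration, not the pipeline's actual code; the helper name `index_codings` and the validation logic are assumptions.

```python
import json

# Hypothetical raw batch response, shaped like the array above
# (two records kept for brevity; real batches hold many more).
raw_response = """
[
  {"id": "ytc_UgyWXOucCWLq411U6Rh4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxYW2da7IJcHFvYpZR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

# Fields every coding record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    dropping any record that is missing an expected field."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS.issubset(r)}

codings = index_codings(raw_response)
print(codings["ytc_UgxYW2da7IJcHFvYpZR4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by `id` makes the per-comment lookup a dictionary access rather than a scan of the array, which matches how an inspection page like this one would retrieve a single coding.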