Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "One of the teachers at my high school was saying how ai art is basically the sam…" (ytc_UgyKgIlj0…)
- "Sorry to ask, but who developed AI? Aren't they some of the largest multinationa…" (ytr_Ugzxmaj1i…)
- "Explanation: The ai being practiced on by the other ai was programmed to act lik…" (ytc_UgwjUV95f…)
- "AI AI AI AI AI AI AI AI AI AI AI AI FEAR FEAR FEAR FEAR!!!! NEXT!!!!!!!…" (ytc_UgwPTUY4U…)
- "Every broke again... not going so well... robot going to control the world 🌎... …" (ytc_Ugzg0LCs-…)
- "i really liked how you connected Zapier with Google Sign-In and then used the AI…" (ytc_UgwWlOrrW…)
- "Anyone with common sense would know it’s a scam the moment a code is involved. I…" (ytc_Ugy6JRlRz…)
- "if you dont realised it by now: There is already a billion dollar industry in AI…" (ytc_UgzzohmA-…)
Comment
It is nonsense that we do not know how AI works. The system is designed by people and based on code, algorithms and data, the model learns.
If people decide to delete the model, that is what will happen and AI will not stop them from doing so.
As in the case of fire, it can be a good servant and a bad master. If someone decides to connect it to a nuclear warhead launcher or to experiment with connecting it to living organisms, it can turn out badly. But it has happened before with other systems.
youtube
AI Moral Status
2025-06-05T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxSQPxEYtntpecc9sV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEa_RlENsTK9iR_pN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxmg4I2da2p0XTYUNh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyO11tcft7STt5v1Zh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzvKo34F2yt-m0rKVB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwnDSTbK7YwicjLetJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyU-Dj6-R-fasSOkf14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwfqikkRdG6M-lf9ml4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywwvKGgnTUBNTmW1N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwPkNCn9pXAsXa9O5B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
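A response like the one above can be checked mechanically before the codings are stored. The following is a minimal validation sketch: the four dimension names come from the response itself, but the allowed label sets are only those *observed* in this one batch — the real codebook may define additional values, so treat `ALLOWED` as a placeholder.

```python
import json

# Labels observed in the sample response above; the actual codebook
# used by the coding pipeline may allow more values per dimension.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "mixed", "deontological"},
    "policy": {"none", "regulate", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "fear", "approval", "mixed", "outrage", "resignation"},
}

def validate(raw: str) -> list[str]:
    """Return a list of problems found in a raw LLM coding response.

    An empty list means the response parsed as JSON and every record
    carried an id plus a recognised value for all four dimensions.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]

    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"record {i}: bad {dim}={value!r}")
    return problems
```

Running `validate` on the raw response before writing to the database catches both malformed JSON (a common LLM failure mode) and hallucinated labels, and the per-record messages make it easy to re-prompt for just the broken entries.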