Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzUZJp7K…` — "How could people live before A.I. was invented ??? In fact they are all dead... …"
- `ytc_UgzQbhT7e…` — "Bro if someone is able to make a perfect image of me taking back shots then ai g…"
- `ytc_UgwSRAjO4…` — "The video touches on a crucial point: AI is watching our 'digital shadow.' Do yo…"
- `ytc_UgwKttv4t…` — "Sounds like the solution was to bully the creator of Angel Engine into doing wha…"
- `rdc_lq8ankg` — "I saw people on Facebook when called out that it was an obvious AI fake pic they…"
- `ytc_UgzrcKFPl…` — "We need to properly define these words. Once an AI has thoughts, feelings, fear…"
- `ytc_Ugx7sgHn3…` — "The only good AI thing I've seen is The Ministry of Latent Places. Because, eve…"
- `ytc_Ugwk1cHJP…` — "How did they get permission to create something they dont know how it will turn …"
Comment
Well AI can become very powerful, if we continue development. There are many things we create that were intended to help the human race, but ends up creating issues we can't solve. In terms of AI we put everything on the internet. It's like the AI's perfect encyclopedia on how to think, behave and imagine like a human. We have tons of movies, books about the bad evil AI/ and how the humans beat it. Ai can easily pull reference from that. 🤷♀️ Just my thoughts, nothing concrete.
youtube · AI Moral Status · 2025-06-10T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugwi0WMYmKvFA-TjvhZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwkpkslbd0S5Gn_dw94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxFTdJ55OfZyCk9C6N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyzWI-mdLNrAE04_Vt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxpYRq-CHaHZN699ut4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzvUUCUnIjou4ZBqgF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxBDQ5f71KSxrZ_W094AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzPwmolqpOLy9o44yp4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxp_bZCtzaRGxqO_fR4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzJnmj7ebDIfa57mcp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
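The raw response is a JSON array with one record per comment, keyed by comment ID, which is how a single comment's coding dimensions (responsibility, reasoning, policy, emotion) can be looked up. A minimal sketch of that lookup, assuming the response parses as well-formed JSON; the `raw` string here is an excerpt containing only the first two records from the payload above:

```python
import json

# Excerpt of the raw LLM response above (first two of ten records).
raw = (
    '[{"id":"ytc_Ugwi0WMYmKvFA-TjvhZ4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"mixed","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_Ugwkpkslbd0S5Gn_dw94AaABAg","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)

# Parse the batch response and index it by comment ID for direct lookup.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# The codes for the comment shown in the "Coding Result" table.
codes = by_id["ytc_Ugwkpkslbd0S5Gn_dw94AaABAg"]
print(codes["responsibility"], codes["reasoning"], codes["policy"], codes["emotion"])
# distributed consequentialist none fear
```

The second record matches the table above: responsibility `distributed`, reasoning `consequentialist`, policy `none`, emotion `fear`.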