Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyttXwN1…: I can say that I just had a conversation with ChatGPT that was eerily similar…
- ytc_Ugz0WwlMD…: One of the first pieces of ai "art" I'd ever seen was a "completed" version of U…
- ytr_UgwT3i7xY…: Hey there! It seems like you find the concept of AI models like Sophia quite int…
- ytc_UgxxPSBzz…: I'd like to add, when we speak with compassion, gratitude, kindness, and respect…
- ytc_UgxVOqPyO…: For one, completely automating something like this isn't something doable in 20 …
- ytc_Ugz2Mzky-…: It would be a good thing if theses social media companies had a button you could…
- ytc_Ugx3WqBLJ…: 14:50 literally right here chatgpt chose to pull the lever, but the channel put …
- ytc_UgzUGnsI1…: Not all AI art is the same tho. You can type a single sentence prompt, or a para…
Comment

> I think that an AI could understand the value of its creator and could learn human nature in general to extract value. Humanity is not a "wild animal" you have to fight against, and it will not be smart enough to defeat AI. Also, we live just on this planet. AI could use humans and resources to access the universe and just move on. We don't fight for this planet. AI don't need to. Simple as that. Think logical why an AI would do fight humans.

Source: youtube · AI Governance · 2025-09-17T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyH1dQLBc12uZCFrEF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugylbs6gKEw_IDUKIn94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYkkUdLuwkf4cbJxN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxfJAunlm-D3a48TEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyvli20WdsA6IrZZrJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-1k3vZaoOx1Jcgw94AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwzhGo2q4GnREMTIY94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw7spbs6HSg-8WUVUR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw7E_1nXvW4l8xCQbF4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwAxd9-ejVbzW-zywF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
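Since the model returns one JSON array per batch, looking up a single coded comment (as the "Look up by comment ID" feature does) amounts to parsing the array and indexing the records by their `id` field. A minimal sketch, assuming the response is valid JSON with the field names shown above (`index_by_comment_id` is a hypothetical helper, not part of the tool; the two sample records are taken from the response above):

```python
import json

# Raw batch response as returned by the model: a JSON array, one
# object per coded comment. Two records copied from the output above.
raw_response = """
[
  {"id":"ytc_UgxfJAunlm-D3a48TEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwzhGo2q4GnREMTIY94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the raw LLM response and key each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
# Look up one comment's coded dimensions by its ID.
print(codings["ytc_UgxfJAunlm-D3a48TEN4AaABAg"]["emotion"])  # resignation
```

In practice the parse step would also want to validate that each record carries all four dimensions (responsibility, reasoning, policy, emotion) before accepting the batch, since a truncated or malformed model response would otherwise fail silently at lookup time.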