Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "whats sad is that there was a kid that killed himself because he was so convince…" (ytc_UgyOwIDeJ…)
- "At this point, we should all stop posting our art to destroy the ai art first an…" (ytc_Ugz8m2KCO…)
- "Ain't nobody gonna be buying anything those trucks are carrying if we don't have…" (ytc_Ugz4UgPYv…)
- "The worst part for AI is my mom. She paints, she has been painting since I was a…" (ytc_UgxoCLCCz…)
- "While using a flawed unqualified technology for therapy isn't the best, I'm fair…" (ytc_UgwFO4vYT…)
- "It's interesting to see companies convert profit through layoffs with risky AI-i…" (ytc_Ugy6ePsOC…)
- "Why don't you try talking to one, like ChatGPT, and see for yourself whether or …" (ytr_Ugz5WyDDg…)
- "It has come to my conclusion AI use in Image generation and Video generation i…" (ytc_Ugzjk_2M1…)
Comment

> Is anyone thinking that just because we can create AI, but should we?? We were just fine without before. Maybe we should pump the brakes before things get out of hand. But, true to out nature as humans, our curiosities and greed get the best of us. It's not until a person is killed by a robot will we stop to think that we should have not pursued such technologies.

youtube · AI Moral Status · 2025-06-05T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgydUQVlYzfVMHCEmg14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzM_4oJAEKpurs0oZF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzDIQaW0jnC9SUtji14AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwM08YnuS6lkD96erx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyZYLQlfYABN0rybYV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyke6VRoKGd1bGG-hl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw3b1jIxjcARDUkIiF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1Qx5nQs4kVToPdVV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqqAMHI-66mAQKswB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxLn_H9OHvADRkdGQV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
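A raw response like the one above can be parsed and checked before it is stored. The sketch below is a minimal example, assuming Python and a codebook inferred from the values visible in this sample; the real coding pipeline and the full set of allowed categories are not shown here, so both the function name and the value sets are assumptions.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The actual codebook may define additional categories (an assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "company", "government", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# One record taken verbatim from the response above.
raw = '[{"id":"ytc_UgwqqAMHI-66mAQKswB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
coded = validate_coding(raw)
print(coded["ytc_UgwqqAMHI-66mAQKswB4AaABAg"]["policy"])  # → liability
```

Indexing by comment ID also supports the "Look up by comment ID" view: once validated, a record can be fetched and rendered as the dimension table shown for the coded comment.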