Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@janmajer4662 AI thinks for it self. Maybe right now it's a bit limited, but in…
ytr_Ugzsti4jp…
Tried a few tools for monitoring AI sentiment but switched to AICarma for its re…
ytc_UgyZaRN1j…
Sooo you're pretty much saying that AI (designed to be human like in nature in g…
ytc_UgwbmlzFu…
This is a well known facet of the way Chat GPT works. It uses a prediction syste…
ytr_UgwFGa7la…
what a person, holy moly, great speaking, pople excited with AI, god bless the…
ytc_Ugz4HkUWL…
I was a fairly early adopter of one of these models that was designed to help wi…
ytc_UgyZVYSs7…
What if ai becomes so smart that we would have to put our money into something t…
ytc_UgzA_2zA8…
You nuts
You forgot that you will give you money to feed yourself,
How you coul…
ytr_Ugy0qpfzA…
Comment
CNN segment with Judd Rosenblatt—definitely an eyebrow-raiser. But even in that interview, the key concern was about alignment and oversight, not "nobody knows anything." Saying AI creators don't know how it works is like saying NASA doesn’t understand rockets because booster stages are complex.
Yes, AI models can behave unexpectedly—especially in large, open-ended systems—but we do know how they're built, trained, and evaluated. The black box metaphor applies to emergent behaviors, not fundamental ignorance. There’s a huge difference between “complex” and “unknowable.”
youtube
AI Moral Status
2025-06-19T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwBx8HvUHgzHzN44sN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9zbJmlQ_wR1w3sER4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwFBnOeTRPQbHlHX0V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzbqHLjs2m7F6coetN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6wigFo0zYR67aXyF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgygVKRxNvQYCtOFADt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx8Z23IQrvqj5Kb7dV4AaABAg","responsibility":"media","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyU7EiTctd0ficQJKd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzE7_iB3co8xy3Vhst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxp6xHQQQsNFQkFQXl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
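The raw response above is a JSON array with one object per comment: an `id` plus the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal Python sketch of parsing such a batch and looking up a code by comment ID — note the value sets below are only those *observed* in this sample, not necessarily the full codebook, and `parse_batch` is a hypothetical helper name:

```python
import json

# Values observed in the sample response above; the real codebook may allow more.
OBSERVED = {
    "responsibility": {"developer", "user", "media", "none"},
    "reasoning": {"consequentialist", "virtue", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: {dimension: value}}."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row[dim] for dim in OBSERVED}
        # Flag values outside the observed vocabulary instead of failing,
        # since the sample may not cover every legal code.
        for dim, value in codes.items():
            if value not in OBSERVED[dim]:
                print(f"unexpected {dim}={value!r} for comment {row['id']}")
        coded[row["id"]] = codes
    return coded

raw = ('[{"id":"ytc_UgwBx8HvUHgzHzN44sN4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_UgwBx8HvUHgzHzN44sN4AaABAg"]["emotion"])  # indifference
```

Indexing the parsed batch by `id` is what makes the "Look up by comment ID" view cheap: each truncated `ytc_…`/`ytr_…` key in the sample list maps directly to one coded row.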