Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
This should be enough to stop open ai all together. The government isn't here fo…
ytc_UgzRKvPI_…
Too bad hes wrong there is no proof that agi is even possible yet but we can be …
ytc_UgyBPtnIm…
Random black guy minding his own damn business.
Ai: we need to watch this guy.…
ytc_UgzxNfRvd…
so many bananas, at least 10x more than the iq of the common ai users…
ytc_Ugw4qbaVx…
Install Powertoys and you can rebind that copilot key to be the right Control ke…
ytc_UgwSDk05M…
As an Entrepreneur, this is a dream come true; levels much the playing field. Ma…
ytc_UgzpMY2HC…
and yet as a media designer trying to get back in the industry after burnout bre…
ytc_UgwE5amkl…
Elon Musk and Sam Altman do not know AI; they are investing in Machine Learning.…
ytc_Ugx7mqzsQ…
Comment
It's up to us, listeners and consumers to choose "Human Made" to deliberately support human artists in all fields. All of us who are worried or upset about AI taking our jobs and art away should choose "Human Made" regardless if AI happens to sound good or make something that amuses us. In the long run int's about our existence. Who will want to raise children in a world with no human opportunities? So, I hope there will be powerful organisations, or in lack of them, grass root organisations starting to label products "Human Made" to helps us make an edjucated choice: choose to support humanity.
youtube
2026-03-10T09:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyaNpvjiHlW-lT1eGR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvVPuhoNkFD-upzbp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyJSzBqj2ppNhtZCOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiOJEywARcsSTShAZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz_PmjSSswMGRLRnIJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy3_c3gqHJyQlbwlPR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxS76GMXoaRhg8XlRZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxhYN53pdwnMUaXIwh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"disapproval"},
{"id":"ytc_Ugx7MC09axfgc5nIxTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxLpbhk0jGgcdCicMN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
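The raw response is a JSON array with one coding object per comment ID, so looking up a coding by ID reduces to building an index over the array. A minimal sketch of that lookup, using two entries copied from the response above (the parsing code itself is an assumption, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment ID.
# Two entries are copied verbatim from the response above as a small sample.
raw_response = """
[
  {"id":"ytc_UgyaNpvjiHlW-lT1eGR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwvVPuhoNkFD-upzbp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"}
]
"""

# Index the array by comment ID so any coding can be fetched directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the "Human Made" comment shown above.
coding = codings["ytc_UgwvVPuhoNkFD-upzbp4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
# → user deontological fear
```

The looked-up entry matches the Coding Result table above (Responsibility: user, Reasoning: deontological, Emotion: fear), which is how a coded comment can be cross-checked against the exact model output.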