Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "11:16 well no one's going to have money to prefer another human do a job for the…" (ytc_Ugws6mCfw…)
- "As a character artist I also really REALLY hate ai art because every time if I’m…" (ytc_UgzRZdCz1…)
- "it's not like big pharma, training and fine-tuning LLM doesn't require a big com…" (ytc_Ugx5cuC2G…)
- "If everyone doesn't have a job because of AI, who would buy those products made …" (ytc_UgzVwMQc6…)
- "Humans should set the goals for AI. Humans needs to regulate the success of sa…" (ytc_UgwopVP_Q…)
- "You may refuse to develop autonomous weapons, but be fully assured that your ene…" (ytc_Ugx4RFKv5…)
- "Im an English tutor at my university and the amount of students whose end of ter…" (ytc_UgwAD95Gp…)
- "Agree with Dave.. \"it's clearly nonsensical to give AI personhood because perso…" (ytc_Ugyq-wZA5…)
Comment
I think the biggest controversy is that any kind of illiterate talentless monkey can now somehow make art that would take an artist hours or days of work. It takes honed skill and talent to create a picture even if it is inspired by another artist. A robot taking already established art and mashing em together is effortless and meaningless. Art isn't a matter of efficiency, it's the expression of human emotion. Without that human element, it's just a bunch of colours in a pattern. Art was never about detail or difficulty. Some of the most famous paintings could be replicated by a toddler, it's the emotion behind the art that gives it value.
Platform: youtube · Topic: AI Responsibility · Posted: 2023-01-27T08:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxrF4ULyB0bBQqD2Ix4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx0b7fdSXAIrdprY1t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyAUulvFKBgaPWFb14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwDmWZWLq_OWZw_zJ14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwzowcmrmh9mJWot0J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyC4RWXUn62QueWxvR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_UgyAQJwxJlpJJIRLzaJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzi8c-25JMWExiKDKZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz38t50VIhtnAWF-kt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxwne1Z9okY2w11WYZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
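The raw response is a JSON array with one object per coded comment: an `id` plus the four dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch, assuming that schema, of how such a response could be parsed and indexed by comment ID for lookup (the IDs `ytc_AAA` and `ytc_BBB` below are hypothetical placeholders):

```python
import json

# The four coding dimensions present in each object of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return a mapping from comment ID to its coded dimensions.
    Missing dimensions fall back to "unclear"."""
    records = json.loads(raw_response)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

# Example with two records in the same shape as the response above.
raw = '''[
  {"id":"ytc_AAA","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_BBB","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

coded = index_by_id(raw)
print(coded["ytc_AAA"]["policy"])  # -> ban
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each inspected comment is a single dictionary access rather than a scan of the response array.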