Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The people saying "accept it" and "deal with it" are most likely the ones who would be the first to be loud about their passion and their jobs being automated and taken by AI, and in that case it wouldn't be okay; but if it's art and artists, that's okay to them, because they probably don't see art and what artists do as a job or even a livelihood. For some reason a lot of people think that creating art is just a hobby of some sort, and that it doesn't take time, effort, energy, and even money to make those pieces of art that they use to train these AIs to generate their "own" images. So if there weren't these pieces of art they blatantly stole, there wouldn't be any kind of AI to begin with, because there would be no data. So why do they think it's okay? If artists were to accept it and never create new art, these AIs would never evolve past this point in a million years, because there would be no new data to steal. I think AI is an amazing tool to help artists in their work, but not to do it for them. Using it as part of the workflow is a cool thing, but just typing in "a cool image of a dog in the style of *insert any artist*", that's just, as you said, stealing those artists' identity. People should put themselves in an artist's shoes for a minute and think what it would be like with their jobs, livelihoods, and passion, and see if it's still okay. Substitute your own context for the artist's, and then think whether it would be okay for you to just "accept it".
youtube Viral AI Reaction 2022-12-24T17:3… ♥ 2
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugx1bZUkJp8OOurtZWp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxppGPwo-VvAlurt494AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzW69WGiiIbn43Xebd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyxutfwQ-BvkQzvVnF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzZHJMOraOnSH7cd4F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2ZjuJDuH2lL3oYNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"sadness"},
  {"id":"ytc_Ugw7gHHMz2UTDmI0M6Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyMwuhFrdAIUpin5v54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"sadness"},
  {"id":"ytc_UgwwWncD3yrgmi3f56x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwQpWrBBZJ1u7NPF2Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
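To inspect the exact model output for one coded comment, the raw response can be parsed as a JSON array and indexed by comment id. This is a minimal sketch: `get_coding` is a hypothetical helper, and `raw_response` holds only two records copied verbatim from the raw response above (the coding result shown for this comment, user/virtue/none/outrage, corresponds to id `ytc_UgzZHJMOraOnSH7cd4F4AaABAg`).

```python
import json

# Subset of the raw LLM response above: two records kept verbatim
# (assumption: the full response is a JSON array of such records).
raw_response = (
    '[{"id":"ytc_Ugx1bZUkJp8OOurtZWp4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"none","emotion":"resignation"},'
    '{"id":"ytc_UgzZHJMOraOnSH7cd4F4AaABAg","responsibility":"user",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"}]'
)

def get_coding(raw_json: str, comment_id: str) -> dict:
    """Return the coding record for a single comment id (hypothetical helper)."""
    records = json.loads(raw_json)
    by_id = {record["id"]: record for record in records}
    return by_id[comment_id]

coding = get_coding(raw_response, "ytc_UgzZHJMOraOnSH7cd4F4AaABAg")
print(coding["reasoning"], coding["emotion"])  # virtue outrage
```

Indexing by `id` also makes it easy to spot ids the model dropped or duplicated when cross-checking the raw response against the stored coding results.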