Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I feel like you kind of missed the point of some of these posts, or they were just poorly worded. Anyway, I have my own statements/questions that I'd like some feedback on.

1) Imagine that you modeled a figure or a miniature. Now you have two options: you can 3D print it, or you can pay a woodcarver to make it. The carved figure will obviously have additional value because of all the effort a real human put into it. But the 3D-printed one is also cool, and it was almost free (if the printer was your own). Same thing with AI 'art': you can pay an artist to make something, and you will be able to appreciate their talent every time you look at their work. Or you can just ask an AI to generate something. It probably won't look as cool, but it's almost free. Note: here I'm comparing a craft (woodcarving) and art. It's not really fair, but I think you get the point. Note 2: you can also still appreciate the human effort put into AI art, namely the effort of the person who engineered the AI itself.

2) The ethics problem. Many artists complain that AI steals their work, and I don't get it. No, really. AI learns by looking at your work (an oversimplification, of course), but so do human artists: beginners look at the work of other artists and learn from it. If AI is stealing, then literally every human is stealing too. Note: of course, this logic only applies to artworks that are free to look at.

3) Now a philosophical, hypothetical question. Let's say that in the future there is an AI that interprets neural activity in order to control a prosthetic limb. Would a painting drawn by a robotic hand controlled by a human via this AI be considered art? The AI gets 'prompts' (neural signals) and then draws on its own (kind of). I'd say yes, because there's more than just an idea coming from the human, but I'm curious what you think.

P.S.: Sorry if my English isn't perfect; it's not my native language.
youtube 2026-02-23T09:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwUwwmP2RVqY4L8A2x4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx0SeyjDcwJkN6TaTt4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz60J3r5xxCTgK6z454AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxqgX0So_aarTHvUk94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw05I1yzs9LApmCITZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzN07s-1Tt7h0c-dsp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugygi9qjmKDlnBgpK814AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwmlDTAATiJ6T4M1714AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx6b_9F-kPD5eqPzFV4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxu7mJnVu8bUXOWkbR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
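The coded dimensions shown above presumably come from matching this comment's id against the entries in the raw response array. A minimal sketch of that lookup in Python, assuming the raw response parses as a JSON array of per-comment codings (the array below is a one-entry excerpt of the response for brevity):

```python
import json

# One-entry excerpt of the raw LLM response: a JSON array of codings,
# one object per comment, keyed by the comment's id.
raw = '''[
  {"id": "ytc_UgxqgX0So_aarTHvUk94AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Fetch the coding for the comment displayed on this page.
coding = codings["ytc_UgxqgX0So_aarTHvUk94AaABAg"]
print(coding["reasoning"], coding["emotion"])  # -> consequentialist indifference
```

The dictionary lookup mirrors the dimension/value table above: each JSON key becomes one row of the coding result.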