Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"People don't need to work..." in certain industries -- white collar jobs, especially entry-level can be and are being reduced quickly. But someone still has to manufacture the physical capital mentioned -- even if you had robots that were developed and then AI to build newer robots, all of that infrastructure takes time, and still human resources are needed, not just computer upon computer. And in our capitalistic society, agree there's a concentration of the early pioneers fighting for ownership -- example, Meta's attempt at VR worlds trying to own the space to get ahead of anyone else, but spectacularly failed (or maybe still being worked on but no news as of late). The gap in haves and have nots will continue to grow before we can get to a utopia/Star Trek world where we work for purpose rather than money/modern living. Humanity would suffer before we can actually get to peace. I certainly don't condone getting rid of it all to return to nature, but at the same time, a cautious approach is needed to ensure it doesn't create the inequality it likely would if allowed to run rampant. A balance between excitement of the possibilities and the fear of quick change that negatively impacts populations. Start with energy consumption -- if you can't manage that aspect, even AI won't survive because we don't have enough energy to support the rest of regular life.
Source: youtube — AI Governance — 2025-08-03T15:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugx_ol9zdBr3S4n3n6N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQckF60wgNJEY-5y54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxYJiNbPMLSq-KFMvl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-tkzcd2QELpVdav54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyWT7oLxRkMDhCrqLl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMwbK71tm0OVg311Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwBcQNyM4ieGCsTLsJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgweE9lcC8FHIYUW6mJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwStHNNqt9FDrwghbZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzywjcoY0JgGy2JQ6B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
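The raw response above is a JSON array with one record per comment, each carrying an `id` plus the four coding dimensions. A minimal sketch of how such output could be parsed and sanity-checked before it lands in the results table (Python is assumed; the allowed value sets are inferred from the values observed in this batch, not from a documented codebook, so they may be incomplete):

```python
import json

# Allowed codes per dimension, inferred from this batch only (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "user", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed"},
}

def validate_coding(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of problems found.

    An empty list means every record parsed and every dimension held
    an allowed value.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"record {i}: bad {dim}={value!r}")
    return problems
```

A run over a well-formed record returns an empty list, while a malformed payload or an out-of-vocabulary code produces one problem string per issue, which makes it easy to log and skip bad batches instead of silently writing wrong codes.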