Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Owning an AI model that is either cutting-edge or kinda sorta open-source, you'll get pushed out of the market if you charge a price and can't keep up with innovations. Entertainment. Human interaction will always be king, it's more authentic. They are just tools, and they will remain as such forever, I'm 99% sure of this, people need people after all. Abstract jobs that can't be properly taught to an AI, but this will fizzle out over time, it just won't happen in the next 5-10 years so you're pretty safe until then. Security. AI have systems that are innately vulnerable to other AI. People introduce human error that AI seeks to replace, but we can't be screwed with in the same way. They'll always be vulnerable. Innovators. Scientists and the like. We're not exactly developing AGI yet. We'll be needing innovation for a very long time. They'll be replaced eventually, but I can't give a timeframe, it's too far off. Likely not within your lifetime.
youtube · AI Governance · 2026-03-02T19:3… · ♥ 6
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       virtue
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgxjL2hbNVlFRppXxSZ4AaABAg.AV0hqT-lAGnAV0k-ITFz7U", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxjL2hbNVlFRppXxSZ4AaABAg.AV0hqT-lAGnAV0ktY7EaY1", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugzr_dGza4U624ENI3t4AaABAg.AUOkMo84zuCAUOvTNu3P00", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxH_SBMuCxyMuigWuh4AaABAg.AUOeovR6QmnAUP21DLg9I4", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgyOwyV98Nfe3smjXtF4AaABAg.AUNie2gvyycAUOsuy_79RK", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugz0pPGuvZfQN61xP694AaABAg.AUNavLZQgdEAUOzCxq30cr", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugzn53Cj5QRmGewX4bp4AaABAg.AU8m26SnstaAUBezDUmN5A", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytr_Ugyd2ax3Q0c21Fthnsx4AaABAg.ATfDj4zlqFFATriCZ6E6lY", "responsibility": "none", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytr_UgzYg0IvYNNNsbQFuJ54AaABAg.ATcy0TUIQi8ATwcG6ej36D", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgzOLZDQI3Lgsu5uAed4AaABAg.AT_eT1oevZBAT_mWM01xqt", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"}
]
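A raw response like the one above can be parsed and checked before the codes are stored. The sketch below validates each record against the dimension values visible in this output; the allowed-value sets are inferred from the data shown here, not from the project's actual codebook, which may contain more labels.

```python
import json

# Allowed values per coding dimension (assumption: inferred only from the
# labels visible in this raw response; the real codebook may define more).
SCHEMA = {
    "responsibility": {"none", "user", "company", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "industry_self", "ban"},
    "emotion": {"fear", "indifference", "resignation", "mixed", "approval", "outrage"},
}


def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
    return records


# Minimal usage with a single hypothetical record:
raw = ('[{"id": "ytr_example", "responsibility": "none", '
       '"reasoning": "virtue", "policy": "industry_self", '
       '"emotion": "approval"}]')
print(len(validate(raw)))  # → 1
```

Rejecting the whole batch on the first unknown label keeps malformed model output from silently entering the coded dataset; a softer variant could map unknown values to "unclear" instead.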