Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Maybe YOU will not, but your descendants might. Maybe 50, 100, 500, 1000 years in the future, there might be a) AI supervisor (robotic or virtual), b) Wetware neural implants that prohibit unauthorized/dangerous behavior (regulating hormone levels, etc), c) Bio-engineering methods that accelerate human development to the point that humans reach adulthood at younger age (skipping from infant directly to adulthood). The more imminent danger would be that schools might become obsolete sooner, for example through homeschooling becoming the norm, with AI teaching from basic stuff to PhD level material. Schools in general were created so that during the Industrial Revolution Factories can train young people as workers to work on factories and follow the protocols of assembly lines. With modern-day automation (self-repairing, self-replicating robots, AI's coding AI, etc), from a purely economic perspective, national mandatory schools might be seen as "wasted money". Nobody can say for sure that these things will happen with 100% certainty, but this theorycrafting is important to see whether we need a different economic model than the one we have now (which is based on national debt, mass production and human labor). The bottom-line question is "What would humanity do if the majority work comes from automated processes and not from human labor?". Would companies/nations/AI overlords provide benefits (salaries, pensions, schools, medical treatment, etc) to the average human if human labor is no longer needed? What would such a world look like?
youtube AI Governance 2025-11-10T19:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgxDLEl410evmW62E6B4AaABAg.APSY3SH9z-lAPS_t1MPDko","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxYSAUQ7Arg8YMQOI14AaABAg.APPFtesUuODAPTGV7EYirt","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugza1g0TKwiydEbC-Cx4AaABAg.APLpz4uVL86APM1CLlXW8q","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgyRZR2E77yVGXLg55Z4AaABAg.APK9VkYS6YgAPh85zakVGK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxqO-tifqVYc1c11Bx4AaABAg.APHhCyCa0AJAPMI_2hJsS_","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgziHEjTd-yaZZ3L8ah4AaABAg.APEycBQvp-0APhAbJ_1WKl","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugyw7aN7GHRbH6ri3Ix4AaABAg.APDt7jOpbSNAPDtd8LQQMe","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyw7aN7GHRbH6ri3Ix4AaABAg.APDt7jOpbSNAPh8VDJyV13","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwnzT7KOBqnhFN3Rv14AaABAg.APD5Qhqg9MBAPPh4hrN4Hq","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_Ugw4_mBry-LfzJ2bU-14AaABAg.APABdiVnp8_APGxVgpTXe2","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
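A minimal sketch of how a coded-result row can be recovered from the raw response, assuming the model output parses as standard JSON and each record is keyed by its comment id (the record below is copied verbatim from the raw response; the lookup helper and variable names are illustrative, not part of the tool):

```python
import json

# One record copied from the raw LLM response above.
raw = ('[{"id":"ytr_UgxYSAUQ7Arg8YMQOI14AaABAg.APPFtesUuODAPTGV7EYirt",'
       '"responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"mixed"}]')

records = json.loads(raw)

# Index codings by comment id so a viewer can look up the row for a comment.
by_id = {r["id"]: r for r in records}

coding = by_id["ytr_UgxYSAUQ7Arg8YMQOI14AaABAg.APPFtesUuODAPTGV7EYirt"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {coding[dim]}")
```

Running this prints the same dimension/value pairs shown in the Coding Result table (responsibility none, reasoning unclear, policy unclear, emotion mixed).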