Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
18:58 "We cannot predict what a smarter than us system will do" AGI, Singularity, ... This is the scariest thing imo that troubles the average person who dares to think and imagine the future in the coming decades. We already have mass production of self-driving vehicles, drones, and humanoid robots; brain implants are now a thing; we have thousands of satellites in low Earth orbit providing internet with Starlink, and now potentially an armada of space datacenters floating around in orbit. If AI truly becomes AGI / sentient etc., this network of AI and drones/bots/satellites could actually become "Skynet", because this superintelligence, which wants to develop and advance at an uninhibited rate, has no limitations like living humans do; for example, it can spend an unlimited amount of time in space as long as it has energy to power it, and colonize the solar system and beyond. Why would this system want to spend its resources so that mortal humans can live and support themselves on the money and resources created by AI systems? Essentially humans will be like a parasitic organism that feeds on and survives because of our creation, which has now become much faster, smarter, stronger than us. While I personally think it'll take a while longer than 2030 to reach such levels of implementation, this troubling development is not out of the possible scenarios in the not-so-far future.
youtube AI Governance 2026-04-21T09:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyc0ZTu23AgbQ4IeIZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzvq2O9AZ56pGGsdbB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw4rR_5W1xMRgQrk-F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx402yLCfCigP3lKql4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxdD3v0Onng5tKoq_F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw61VOpKMRJf_oEcHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIRd315Hnp9zVCCGx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6UJPObN50J8KCz0N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx8Vtl3UOzTynj8JeN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw-QFR7yKdSbEH4Pnt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
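To inspect a coded comment programmatically, the raw LLM response above can be parsed as a JSON array and filtered by comment id. The sketch below is a minimal, hypothetical helper (the `lookup` function name is an assumption, not part of any tool shown here); the ids and dimension values are copied from the response above, truncated to two entries for brevity.

```python
import json

# Two entries copied verbatim from the raw LLM response above
# (the full array contains ten records).
raw = '''[
  {"id":"ytc_Ugyc0ZTu23AgbQ4IeIZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw61VOpKMRJf_oEcHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

codes = json.loads(raw)

def lookup(comment_id):
    """Return the coding record for one comment id, or None if absent."""
    return next((c for c in codes if c["id"] == comment_id), None)

record = lookup("ytc_Ugw61VOpKMRJf_oEcHd4AaABAg")
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

Using `next` with a default of `None` keeps the lookup safe when an id from the comment table does not appear in the model output, which is worth checking whenever the raw response and the coded table are produced in separate steps.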