Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
7:39 this is essentially fantasy at this point still. We have the llms and we have the advanced AI and Im not seeing the grand agi consciousness that’s going to figure out robot plumbers. We are still a century away from robot plumbers imho. I could be wrong of course but I don’t see this in my lifetime, maybe my children will see it but I just don’t think we have the wherewithal to actually do these things yet, similar to Elons vision of seeding Mars, like it’s an awesome necessary idea that will eventually become a reality but in our current world Im just not believing (nor seeing it); the true advancement in AI. I’ve been surprised and shocked by technology though, so i of course could be wrong. What I see is progressively more superficial media intelligences sort of learning how to simulate reality for entertaining or interesting ontological reasons. Like I have always had this idea that eventually an ai will try to convince us of a fake war or invasion and everyone totally falling for it, and using this as a precursor for genocide or something. That’s a reality i do see in our lifetime. The sub-agi tier ai is not as profound as we collectively believe it is imho. Im probably way off on this one though i intentionally try not to follow technology and trends and stuff but it’s almost impossible in today’s world
Source: YouTube, "Viral AI Reaction", 2025-11-05T01:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwJ6DX9yYqie2hR6Mt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwKhTYWtHOhe123pGB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxTPgvUrEoRFOOz0it4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwpP7UVbNpkmMPxW5V4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy1pLJ0ohbhgsTvZWh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwMYapNlCQR-9XWcK54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyc5Z9pTtNLV8fNki94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyl3UWEj-b1cI4xVU14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxaOn6hZiGxY2SAw1l4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgyEphxVsoJvvEV0Z9x4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
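The raw response is a JSON array of per-comment coding records. To inspect the exact model output for any one coded comment, the array can be parsed and indexed by comment id. A minimal Python sketch, assuming only the field names visible in the response above (two records shown for brevity):

```python
import json

# Excerpt of the raw LLM response above: a JSON array of coding records.
raw_response = """[
  {"id": "ytc_UgwJ6DX9yYqie2hR6Mt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwKhTYWtHOhe123pGB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Index every coding record by its comment id for O(1) lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

# Retrieve the coding for the comment shown in this section.
coding = codings["ytc_UgwKhTYWtHOhe123pGB4AaABAg"]
print(coding["emotion"])  # indifference
```

The id lookup is what ties a record in the raw response back to a specific comment; the tabulated "Coding Result" above corresponds to the second record here.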