Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They are right that, long term, everyone will be replaced by AI, but they are way over-optimistic in their predictions. Even Kurzweil himself was wrong: 1) Kurzweil said self-driving cars would operate on the streets by 2020; it didn't happen. Reasons: AI is not intelligent enough, sensors are very expensive, and manufacturing requires ultra precision, which elevates cost. 2) Nanobots in the bloodstream by 2020; it didn't happen. We are barely finishing with genomics currently. 3) Kurzweil predicted AGI would be reached by 2020 (this was back in the 1990s-2000s); it didn't happen. We only got LLMs (predict the next token plus a chain-of-thought algorithm). Bottom line: the trend is gigantic, but because of its size it develops very slowly on a human timescale. And once human labor is replaced by robots, there will be humans controlling tons of robots, again creating demand for jobs, because someone has to be responsible for them.
YouTube · Viral AI Reaction · 2026-04-24T19:2…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxVrwdMmudno9rl0d94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy1KiiApipIQj7EHwJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzPK4SrqYzm9MDR_u54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzjFR7sW6GZ-1SXs-94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzHM_OW1mUeGQPqLgl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxO0DAZEsnvPxz37d14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx_0eWvHTwSSXmJBjt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyUCCA6h15zOZE2xip4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "sadness"},
  {"id": "ytc_UgzOqYD1S9P_HiIqVUl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzZEMcN5P4sYBzlFMt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
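The raw response is a JSON array with one object per coded comment, keyed by comment id. A minimal Python sketch of how such a response might be parsed and looked up for inspection; the `index_codes` helper is hypothetical (not part of the tool shown here), and the single-entry `raw` string is an abbreviated copy of one row from the response above:

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment coding rows.
# Each row carries the four coded dimensions plus the comment id.
raw = (
    '[{"id":"ytc_UgzZEMcN5P4sYBzlFMt4AaABAg",'
    '"responsibility":"distributed","reasoning":"consequentialist",'
    '"policy":"none","emotion":"mixed"}]'
)

def index_codes(raw_json: str) -> dict:
    """Hypothetical helper: parse the response and index rows by comment id,
    so a single comment's coding can be retrieved for inspection."""
    rows = json.loads(raw_json)
    return {row["id"]: row for row in rows}

codes = index_codes(raw)
row = codes["ytc_UgzZEMcN5P4sYBzlFMt4AaABAg"]
print(row["responsibility"], row["emotion"])  # distributed mixed
```

Indexing by id rather than list position makes the lookup robust if the model returns rows out of order.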