Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
uh hi yea, usually AI is good(like barely) at landscapes and city scapes... i somehow created something so complex no ai can think of it... probably because it doesnt exist so there is no training for it to know anyways great video! oh right, this cityscape is getting made by a friend of mine and so far its coming out great!! i also have a book of my own that is being put on hold to get this one project started! Also i feel like i should go on a rant, hi yes im a history nerd aswell as a writer. Stop killer robots wont stop robots from being used in war. But i can garuntee you one thing will happen. Eventually wars will become entirely based around robots. Thats right! No human death or any death required since everything WILL be autonomous. Its gonna take a minute but thats the natural step in which will happen and has ALWAYS happend with war. And sadly war happens but the stance we should be taking is STOPPING HUMANS FROM BEING USED AS NUMBERS and START USING ROBOTS AS THEIR DESIGNED PURPOSE. As a number. Im not pro-AI im pro-survival. Id much rather hear that private johnson got replaced by a robot because that means that we just... got better... human deaths are no more yes that soldiers job is being replaced but speaking from a moral standpoint? In a war that is faught where we wage human lives nobody wins. The best thing we can do is have a system where there is a definitive winner and have no death. The only way to do that is to have robots/automated systems fight those wars for us. Because afterall. War is meaningless...no war is hopeless
youtube Viral AI Reaction 2025-12-12T12:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwQUYrHpF7k7dExGhp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyqAInUEmVxBkCfQQh4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxPdnh8n_Yplf7DEWl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgwQvg1e0tVteDRUjdR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwB8kSZQWuZNKlA6Vd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw4bWieJnTA7kqTVsV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzq6Xtv0lFy8pB_WKd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxV-z7wVnrOJ8m53aR4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzhIIcJlOxz-nBbLtZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzZG0QNYxU2DimHlX54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
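A minimal sketch of how a raw response like this can be joined back to a per-comment coding table, assuming the model output is a valid JSON array with unique `id` fields (the field names come from the response above; the inlined one-row string is a hypothetical excerpt, not the full batch):

```python
import json

# Hypothetical one-row excerpt of a raw LLM response; in practice, pass
# the full JSON array string returned by the model.
raw = '[{"id": "ytc_UgwQvg1e0tVteDRUjdR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}]'

# Index the coded rows by comment id so any comment's codes can be
# looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_UgwQvg1e0tVteDRUjdR4AaABAg"]
print(row["responsibility"], row["emotion"])  # -> none approval
```

If the model occasionally returns malformed JSON, wrapping `json.loads` in a `try/except json.JSONDecodeError` and flagging the batch for re-coding is a common safeguard.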