Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The dangerous part about Ai is when = 1) replacing people jobs which 2) increases unemployment rate which 3) makes it harder to secure limited job positions due to even higher competition which also 4) reduces people buying power 5) hence reducing the country overall wealth and economy 6) also lesser tax income per year due to higher unemployment rate You may not see it now but when every company uses ai to cheapen labour cost you will see its effects. A factory used to have 100-1000 workers. When they all go back home they can use the money to buy groceries at a grocery store to feed their families. When ai and robots replaces them then these people became jobless and have to find a job in a company that doesnt need ai or else they wont be able to buy anything. The grocery store also suffer a huge loss of income from 100-1000 factory workers that used to buy from them everyday since ai robots doesnt need food or drinks to live. People say they are still engineer jobs and maintenance for ai available but how many engineer and maintenence you actually need in a company for ai is it 100-1000 engineer? Its way less than that and people should know that.
youtube AI Responsibility 2025-10-21T09:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxIOZLOxY-GRpIgQCx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz_oiTVbw_7hlbgjIt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyHsEDv_4G8jtZELUN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBJFswHeLhHVQlRbZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyMuG-wfKbCyploR094AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyjHlv2F_MD-yrEOOZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw2DJVW-x4B0_uRqIp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnQXagNBkJE4Lt9MB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy8UB62AzhagVscmdd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwpmNHjAx9i21L8Bs14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
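The raw response is a JSON array with one coding record per comment, keyed by a `ytc_…` comment id. A minimal sketch of how such a response could be parsed and indexed for lookup (the parsing code is illustrative, not part of the original pipeline; the two records shown are copied from the response above):

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings,
# same shape as the full response shown above (two records kept here).
raw = """[
 {"id":"ytc_UgxIOZLOxY-GRpIgQCx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgwpmNHjAx9i21L8Bs14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index the codings by comment id so any coded comment can be inspected directly.
codings = {rec["id"]: rec for rec in json.loads(raw)}

# Look up the coding that matches the comment displayed on this page.
record = codings["ytc_UgwpmNHjAx9i21L8Bs14AaABAg"]
print(record["responsibility"], record["policy"], record["emotion"])
# → company regulate fear
```

Indexing by id makes it easy to cross-check the displayed coding-result table against the exact record the model emitted for that comment.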