Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
All a bunch of bs. AI isn't close to being the boogie man that you guys are selling it as. Half if the companies that went all in are trying to hire back humans. AI doesn't really understand anything, it basically guesses the next word based on the current one. That's why you get random " hallucinations" and real people have to go back and check the work.
Source: youtube · AI Governance · 2025-10-27T07:0… · ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxlHsx8ylzUKrUH3xF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx7DWLZWjKXPdPP_BV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxiKwWq2gF-Xx2tmhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxIlb2rOrOHjRTgtVZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJHlk0vWPqsasnpgR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz4LM7_wo6GFfUMLBV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNiqAqbuzPs8jjqQp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwpamAfkcipmSiDiwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwvHlVyhvehgn2g6Rx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyeHonGWu9BhP3Rr1N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
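The coding result above is presumably produced by parsing this raw batch response and looking up the record for the comment's id. A minimal sketch of that step, assuming the five-field schema shown in the JSON (the `index_codings` helper name is hypothetical, and the batch is truncated to two records for brevity):

```python
import json

# Two records copied from the raw batch response above (truncated for brevity).
raw = '''[
  {"id":"ytc_UgxJHlk0vWPqsasnpgR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxlHsx8ylzUKrUH3xF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"}
]'''

# The four coding dimensions plus the comment id, as seen in the response schema.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a batch coding response and index the records by comment id.

    Raises ValueError if a record is missing any expected dimension, so
    malformed model output fails loudly instead of being stored silently.
    """
    records = json.loads(raw_json)
    indexed = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        indexed[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return indexed

codings = index_codings(raw)
print(codings["ytc_UgxJHlk0vWPqsasnpgR4AaABAg"])
```

The validation step matters here because the raw response is free-form model output: a record with a missing or misspelled dimension should be rejected at parse time rather than surface later as a blank cell in the coding table.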