Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When you create a new species (AI) that is smarter than you, you lose control of your future. I went to Harvard for computer science. Then AI came on the scene, and people predicted it would be decades before we should be concerned. Now we know it's much faster and smarter than we thought. Businesses are going to use it as it doesn't take vacations, doesn't need health care, and the list of why it's better in all respects for a company continues. We thought it would take longer, but AI will soon be able to program itself. This new world will happen faster and faster. LLM (large language models) were thought only possible in 2050 or so, just five years ago. They started out being as smart as a high school student, new versions were created, and now as smart as a college graduate. Companies will be forced to use them or perish. Much like businesses taking advantage of cheap labor in China starting in the 70s, and soon American workers were "too expensive".
Source: youtube | AI Harm Incident | 2025-06-20T17:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          fear

Coded at: 2026-04-26T23:09:12.988011
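The coding result above can be read as one record per coded comment. Below is a minimal sketch of that record as a Python dataclass; the class name, field names, and the example values mirroring this table are illustrative assumptions, not the tool's actual schema, and the comment id is assumed to match the corresponding row in the raw response further down.

from dataclasses import dataclass

@dataclass
class CodingResult:
    # One coded comment along the four analysis dimensions plus the
    # coding timestamp. Class and field names are illustrative only.
    comment_id: str
    responsibility: str   # e.g. "developer", "government", "company", "distributed", "none"
    reasoning: str        # e.g. "consequentialist", "deontological", "unclear"
    policy: str           # e.g. "regulate", "liability", "none", "unclear"
    emotion: str          # e.g. "fear", "outrage", "resignation", "mixed"
    coded_at: str         # ISO 8601 timestamp string

# The result shown in the table above, expressed as one record.
example = CodingResult(
    comment_id="ytc_Ugyan9pTN0GLatkcbEh4AaABAg",  # assumed id of this comment
    responsibility="developer",
    reasoning="consequentialist",
    policy="regulate",
    emotion="fear",
    coded_at="2026-04-26T23:09:12.988011",
)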
Raw LLM Response
[ {"id":"ytc_UgyD2lWZqHZy1dSQ7o14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}, {"id":"ytc_Ugz_9iw7F5U6UvfUvyl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgyZbzUIHwJMhoiFdRh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugy17hjlShobN0rlaYp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzNo_ZL1K8by6V3yO94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugy9GO5-0ZgxmfRulKl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugyan9pTN0GLatkcbEh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzXL4YW8hAvverSSjh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxGL_Z5hmR_gqcBJw94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwvpZtoRXKuHG4NqVZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"} ]