Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If I sat you in a box and demanded you stay perfectly alert for hours which doing nothing. You would fail. Demanding people stay perfectly alert while in a self driving car is ridiculous. This technology does not work!
youtube AI Harm Incident 2025-01-19T14:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           ban
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzZ-0lUFQrxdAMS9Bp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2UX64H1Mo4c0vgt54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxApkn0WMMqvL5pWX14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx0swZ4kh5iRug0c9N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw0DYmkwE5ehsW0JoN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyyJWaVDd20D2MIxxR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxGMAWY7O7_4eg2v_d4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz3of2MQmWnLP309o14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz7uO5xSoYu8S5aAnB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz9qbULdEK1QWoqTih4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
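The raw response above is a JSON array with one record per comment, keyed by comment `id` and carrying the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how the per-comment coding can be recovered from such a response (the helper name `lookup_codes` is hypothetical, and the excerpt below reuses two records from the array above):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes.
raw_response = '''[
  {"id": "ytc_Ugz7uO5xSoYu8S5aAnB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz9qbULdEK1QWoqTih4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

def lookup_codes(raw: str, comment_id: str):
    """Parse a raw LLM response and return the code record for one comment id,
    or None if the model did not emit a record for that id."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

codes = lookup_codes(raw_response, "ytc_Ugz7uO5xSoYu8S5aAnB4AaABAg")
print(codes["responsibility"], codes["policy"], codes["emotion"])
# → company ban outrage
```

Matching by `id` rather than by array position guards against the model reordering or dropping comments in its output.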