Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great to think of it like this, but the world is messy. Some driverless cars have been programmed to value the passengers' lives more than pedestrians/other drivers. They would willingly run over a kid in the crosswalk if it meant that the car didn't have to swerve into a pylon to stop in time (potentially injuring passengers). Frankly it's terrifying to think that AIs will literally decide which humans live and which humans die.
youtube 2023-08-01T13:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzLlQ-VJ37w6AkNULV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzoQg-B1A3ftW-LnOp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxOsg6LzU6jHjYJYFl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxoP_L0DxAesFn225h4AaABAg", "responsibility": "none", "reasoning": "none", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzUWfs3o37XJeSpTet4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzGGesYeDpNJBUZxDl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgxNeABeEEMW_EeMQaR4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxgwqSAjcKAWJCayMZ4AaABAg", "responsibility": "none", "reasoning": "none", "policy": "none", "emotion": "amusement"},
  {"id": "ytc_UgyYHXOsZeB-IgfmUKx4AaABAg", "responsibility": "none", "reasoning": "none", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx2dbSuGCMa9mHuve14AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
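A raw response like the one above can be turned into per-comment coding results by parsing the JSON array and validating each dimension against the codebook. The sketch below is a minimal illustration, not the pipeline's actual code: the `CODEBOOK` value sets are assumptions inferred from the labels visible on this page, and the sample `raw` string is a shortened copy of one entry from the response.

```python
import json

# Shortened sample of a raw LLM response: a JSON array of per-comment codes.
raw = """[
  {"id": "ytc_UgzUWfs3o37XJeSpTet4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

# Assumed codebook, inferred from the values shown on this page;
# the real coding scheme may allow other values.
CODEBOOK = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"none", "fear", "approval", "disapproval",
                "indifference", "amusement", "outrage"},
}

def parse_codes(text):
    """Parse a raw response into {comment_id: {dimension: value}},
    rejecting any value outside the codebook."""
    out = {}
    for row in json.loads(text):
        codes = {dim: row.get(dim, "none") for dim in CODEBOOK}
        for dim, value in codes.items():
            if value not in CODEBOOK[dim]:
                raise ValueError(f"{row['id']}: unexpected {dim} value {value!r}")
        out[row["id"]] = codes
    return out

codes = parse_codes(raw)
print(codes["ytc_UgzUWfs3o37XJeSpTet4AaABAg"]["emotion"])  # fear
```

Validating against a fixed codebook catches the common failure mode where the model invents an off-schema label, so bad rows fail loudly instead of silently entering the coded dataset.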