Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The main thing people don't appreciate about self-driving cars is that they make riding around on bikes and other micromobility devices much, much safer. The biggest downside risk to commuting by ebike is distracted/drunk/bad human drivers 1-shotting you with their large vehicles. Computers won't hit cyclists so people will feel safer riding on bikes, which will lead to a huge increase in the number of people willing to commute on bikes. Then there's the multi-modal transit mode of having a bike for short distances and just putting it on the rack of a self-driving car when you need to go more than 10 miles or so. Also, it will be far easier for planning departments to cars and build out bike and pedestrian only infrastructure. The computers driving the cars won't complain the same way human drivers will about getting rerouted, and the people riding in the cars won't care if the new bike lane causes their ride to take another minute or two, because they'll be playing civ 7 or watching netflix on their phone or whatever.
youtube 2025-06-14T20:0… ♥ 55
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgxZR951aY7DHeNKuwB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyBZYqwD8Y-IXp7lD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzFTcSWb776R2sKD0t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzRPak_gfOOjIDHnmd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzaMPMzLsJyUBQR4614AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx64LBJgrWkcFTrPnN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgypLD915q7-0znXtgt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxnBJD46eGn7BqvRZN4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx0NIbtMPTxJep_quh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxPc4GhSYbDaBilLJ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"})
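Note that the raw response above is not valid JSON: the model closed the array with `)` instead of `]`, and the comment's own id does not appear in the batch. A minimal sketch of how a coder might end up recording every dimension as "unclear" in that situation is below; the function name, fallback values, and id-matching logic are assumptions for illustration, not the actual pipeline code.

```python
import json

# Fallback used when the model output cannot be used for a comment.
FALLBACK = {
    "responsibility": "unclear",
    "reasoning": "unclear",
    "policy": "unclear",
    "emotion": "unclear",
}

def code_for(comment_id: str, raw_response: str) -> dict:
    """Return the coded dimensions for one comment.

    Falls back to "unclear" on every dimension when the raw response
    is not parseable JSON or when the comment id is absent from it.
    """
    try:
        entries = json.loads(raw_response)
    except json.JSONDecodeError:
        # e.g. the array above, which ends in ")" instead of "]"
        return dict(FALLBACK)
    for entry in entries:
        if entry.get("id") == comment_id:
            return {k: entry.get(k, "unclear") for k in FALLBACK}
    return dict(FALLBACK)

# A broken payload (hypothetical, mimicking the ")" error above):
broken = '[{"id":"ytc_A","responsibility":"none"})'
print(code_for("ytc_A", broken))  # every dimension falls back to "unclear"
```

Under this sketch, either failure mode (unparseable JSON or a missing id) produces exactly the all-"unclear" row shown in the Coding Result table.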