Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Elon is so full of shit on predictions...

"November or December of this year, we should be able to go from a parking lot in California to a parking lot in New York, no controls touched at any point during the entire journey." - Elon in April of 2017

"I feel very confident predicting that there will be autonomous robotaxis from Tesla next year — not in all jurisdictions because we won't have regulatory approval everywhere... we'll have over a million robotaxis on the road." - Elon April 2019

"I think we will be feature complete — full self-driving — this year, meaning the car will be able to find you in a parking lot, pick you up and take you all the way to your destination without an intervention, this year. I would say I am of certain of that. That is not a question mark." - Elon 2019

"We’re going to try to fly the Red Dragon mission to Mars in 2018... It’s a 2-year cycle, so 2018, 2020, 2022, 2024. And I think if things go well, we might be able to send people in 2024, to arrive in 2025." - Elon in 2016

"If you say, 'When is it likely for the first time... for humans to land on Mars?' I think... five years is possible, six years." - Elon in 2021

"We’re going to be in volume production with the Semi next year." - Elon 2020

"We hope to have this in a human patient by the end of next year." - Elon in 2019

"Based on current trends, probably close to zero new cases in US too by end of April." - Elon in 2020
Source: YouTube · Viral AI Reaction · 2025-11-04T19:3… · ♥ 26
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzpMnvuwCoqbYqE5fV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzx94udIMKyIB8NfDN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx60pNQt7BSt36Crgl4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzIxr1SNbStayxuRDF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz26P0YN-agPKLETwt4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwbFLZmQKsQFLuW1At4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzm7APUwWjAPEzh35d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzQduFcGQxmP4LQytd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgybUwNBPm3IzJmcXER4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyvPWtENGRrgOZ5KDt4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
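The per-comment coding shown above is recovered from this raw response by parsing the JSON array and indexing it by comment id. A minimal sketch of that lookup (field names are taken from the response itself; the parsing code is an illustrative assumption, not the tool's actual implementation):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw = '''[
  {"id": "ytc_UgzpMnvuwCoqbYqE5fV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzIxr1SNbStayxuRDF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coded dimensions for one comment.
coded = codings["ytc_UgzIxr1SNbStayxuRDF4AaABAg"]
print(coded["reasoning"], coded["emotion"])  # → unclear indifference
```

If the model wraps the array in extra text or a code fence, the JSON span would need to be extracted first (e.g. slicing from the first `[` to the last `]`) before `json.loads` will accept it.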