Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Almost 90% of jobs were replaced within a generation by machines in the 1900's... (and it didn't cause 90% unemployment then) Economics just boils down to how much energy you have to put out vs how much you can earn for it. You can express this in Calories: 8 hours of unskilled labor cost you about 2,500 Calories (upper case 'C' aka "Food calories" or Kilocalories)....

>In 1800, the money earned could only buy, at most 2500 Cal in other labor.
>But in 2020, you can buy *at least* 1,116,000 Calories in the form of gasoline.

....granted, much of that benefit is lost in the conversion to goods due to regulatory redundancy), but the production of many many many goods is unaffected by this cheap energy....and that is what automation will improve, from a "whole picture perspective" Automation will just mean shorter work weeks for those who want that, as things will cost so much less, and equally more wealth for those who don't. Most future jobs will be more human-interaction based, also. Service industry, etc... you'll see alot more life coaches I'll bet, for example ..
Source: youtube · AI Moral Status · 2020-01-18T09:3… · ♥ 2
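The comment's gasoline arithmetic can be reconstructed under plausible 2020 assumptions. The energy density (~31,000 kcal per gallon of gasoline) and the wage and price figures below are illustrative assumptions chosen to match the quoted totals; only the 2,500 and 1,116,000 Calorie numbers come from the comment itself:

```python
# Rough check of the comment's Calorie comparison. All inputs are
# illustrative assumptions except the two totals quoted in the comment.
KCAL_PER_GALLON = 31_000   # assumed energy content of gasoline (kcal/gal)
DAILY_WAGE_USD = 72.0      # assumed: 8 h of low-wage labor in 2020
GAS_PRICE_USD = 2.0        # assumed 2020 price per gallon

gallons = DAILY_WAGE_USD / GAS_PRICE_USD     # 36 gallons
kcal_bought = gallons * KCAL_PER_GALLON      # 1,116,000 kcal, as the comment claims
labor_kcal = 2_500                           # the comment's cost of a day's labor

print(f"{kcal_bought:,.0f} kcal bought, ~{kcal_bought / labor_kcal:.0f}x the labor input")
```

Under these assumed inputs the comment's figure is internally consistent: a day's wage buys roughly 450 times the Calories the labor itself consumed.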
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw0eXvk9V-YJWrbzoJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzVLMEz94l8SAhuCM54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzi_dJ8gdJv5EvwG4F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyWI4ARrPKwvY4itDh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx2vmj-GbuBgNPPQ-Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy3P2ZRdSq8sv9DuAF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzTj24blwxr6LJGSVJ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxjnWR8VmeEi2_SETZ4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxDDxzsqBFSTYY-UNZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzMOKciiMcAyhcP-gl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
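A batch response like the one above can be parsed back into the per-comment dimension/value rows shown in the coding result. A minimal sketch of that step, assuming the response is valid JSON with exactly these four dimensions per record (the function name and the two sample records are for illustration only; this is not the pipeline's actual code):

```python
import json

# Two records copied verbatim from the raw response above.
raw = """[
  {"id": "ytc_Ugw0eXvk9V-YJWrbzoJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzVLMEz94l8SAhuCM54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Map each comment id to its {dimension: value} coding."""
    records = json.loads(raw_response)
    return {r["id"]: {d: r[d] for d in DIMENSIONS} for r in records}

codings = index_codings(raw)
print(codings["ytc_UgzVLMEz94l8SAhuCM54AaABAg"]["emotion"])  # indifference
```

Indexing by id makes the lookup for any single coded comment a dictionary access, which matches how this page pairs one comment with its row of dimension values.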