Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
9:44 - I sympathize with your position; I do. However, consider a point of view on the other side: Without all of this data, how are we, as Data Scientists, meant to develop these models without bias? How are we to train an AI that sufficiently represents the whole of humanity and its views without data? If we were to go for an opt-in system; then forgoing the complexities of such a system, AI would be trained on less data; resulting in cultural biases that do not reflect the ethical side of AI as being democratized. And then, consider an artist who has passed, but the copyright is still in effect: how would you propose a system that can - within reasonable means - determine that they are dead and then somehow get permission from their next of kin? Such practice is both an invasion of privacy on the part of the Artist's family and destructive to the field of AI for the reasons stated above. Consider thereby, what amount of context you would need to imbed in the knowledge representation in order to properly extract all of the information and metainformation required to be sufficiently generalized as to be useful? Until we develop such an AI with a sufficiently dense knowledge representation; data at extraordinary levels is required for the advancement of the field. Would you, now knowing this, force such tight constraints on the development of AI just on the basis of a few problems?
Source: YouTube · "Viral AI Reaction" · 2023-01-01T05:4… · ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugx7xOIegGZs9L8OmuB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgxSdWq_WC72gWjK6Rx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwAuXmWqXkohyAlQTB4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwpCct-rqY9VXi0ETd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx1RPILVomjLGZCS6d4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugw7Wfa4voobjZnONS54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxCb9qGXf7_PYjl05t4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyTZuwI0eYCFY_kJSN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxRIwYk-ynqhEBbMtN4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyfTrWYm153_8PIHDF4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
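The raw response above is a single JSON array covering a whole batch of comments, so the per-comment coding shown in the table has to be looked up by comment id. A minimal sketch of that lookup (the `coding_for` helper name is hypothetical; the excerpt below reproduces two records from the array above):

```python
import json

# Excerpt of a raw batch response; the full array returned by the model
# contains one record per comment in the batch.
raw = '''[
  {"id": "ytc_UgwpCct-rqY9VXi0ETd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx7xOIegGZs9L8OmuB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"}
]'''

def coding_for(raw_json: str, comment_id: str) -> dict:
    """Return the coding record for one comment id from a batch response."""
    records = json.loads(raw_json)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

coding = coding_for(raw, "ytc_UgwpCct-rqY9VXi0ETd4AaABAg")
print(coding["responsibility"], coding["policy"])  # distributed regulate
```

Indexing by `id` rather than by array position is the safer choice here, since nothing guarantees the model returns records in the same order the comments were submitted.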