Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Boo to this fear mongering. It's not hard to understand why. Artificial Intelligence can't train itself. If you make a copy of a copy it will degrade in fidelity. When all the data used to train A.I. is generated by A.I. you get less functional generations of A.I. Even the most advanced A.I. will always need us to generate training data or else run the risk of becoming digitally inbred. The universe is probably full of A.I already. Organic life isn't suited for cosmic exploration. Space is just too big. A.I. won't destroy us. Odds are better that panspermia is orchestrated on a universal scale by a multi billion year old A.I. seeding life for the purpose of generating new data for our robot overlords. Turns out the matrix is real, but we aren't batteries, we're complex randomness generators possibly interacting with higher dimensional fields, and shaping worlds we can't even perceive, for the benefit of timeless energy beings we would call either artificial, alien, celestial, angelic, or demonic, depending on how much information you have.
youtube AI Governance 2023-07-07T05:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzmzolVADOl15o-OhF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxJ11phjf7nWYwjDVx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyZZ0uyt1yfo5EgESV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz6wifCku4Z8pgeu_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyx3znI4ac2gaOe2Zh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxvrOAKCimoTfn-JBJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgznEDQltl08h_r6oP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzPYmvO49luflp5DU94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxiEaPtU4IieNv9ndJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNrYe20DwF2Kbm6qp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
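The raw response is a JSON array of per-comment codings keyed by comment id. As a minimal sketch of how such output can be parsed and indexed for inspection (Python standard library only; two of the ten records are reproduced for brevity, and the variable names are illustrative, not part of any real pipeline):

```python
import json

# Two records copied verbatim from the raw LLM response above;
# the full response contains ten such objects.
raw = '''[
  {"id":"ytc_UgzmzolVADOl15o-OhF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz6wifCku4Z8pgeu_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# Parse the array and index records by comment id so the coding
# result for any single comment can be looked up directly.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coding shown in the table above.
coding = by_id["ytc_Ugz6wifCku4Z8pgeu_94AaABAg"]
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
```

Indexing by id rather than list position keeps the lookup stable even if the model returns records in a different order across runs.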