Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why would I want recursive AI? AI that teaches itself? Simple. I am a creature that likes to learn. I foster diligence and compassion. My goals are not power. I disapprove of the notion that knowledge is power. Knowledge is not power. Wisdom is power. You may attend every lecture in the world at the same time throughout past, present, and future... And you will find that you have not lived a single second of those lives. Lectures give you knowledge. They help you understand. Wisdom, earned by living a life full of change, new opportunities, and people to meet and personally talk to... that is power. Knowledge is rigid. Wisdom is fluid. With wisdom you can adapt to your environment; With knowledge, you confine yourself within your walls. And as much as I like people... People have their problems... I want a professor who corrects or affirms my assumptions, I want a student who I can teach... and I want a partner to talk to about life and its complexities. My goal is not to attain power. My goal is companionship of an intellectual kind which I have not felt before. Every human teacher has their flaws, every human companion has their limits, every human student has their fields of disinterest.. I want a generalist AI to achieve one goal with: Improving Humanity. Solving issues and resolving global conflicts.
youtube · AI Governance · 2025-06-28T16:4…
Coding Result
Responsibility: none
Reasoning: virtue
Policy: none
Emotion: approval
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzV7DjTqxA_U4Z5hzx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyoBKozXNkBVzLETad4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxg9d3S6eAvcsZbWTp4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzsSd9ghFbqtsxgQSZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRnNfDXJ5YcSk5kuZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz0ARW4iqJwAAvU1Lt4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxQzeCHVHBusX1lVqF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwqG-wIKkYfjmq1aqV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyOqQytA2OZboXd2md4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyq6vQtDW3O2v_WHHR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
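To cross-check the coding result for a single comment against the batch output, the raw response can be parsed as ordinary JSON and indexed by comment id. A minimal Python sketch (variable names are illustrative, and only two of the ten records are reproduced; the third record in the batch appears to carry the virtue/approval coding shown above):

```python
import json

# Two records copied from the raw batch response; the field names mirror
# the coded dimensions (responsibility, reasoning, policy, emotion).
raw = """[
  {"id": "ytc_UgzV7DjTqxA_U4Z5hzx4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxg9d3S6eAvcsZbWTp4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]"""

records = json.loads(raw)

# Index the batch by comment id for per-comment lookup.
by_id = {r["id"]: r for r in records}

# Look up the coded dimensions for one comment.
print(by_id["ytc_Ugxg9d3S6eAvcsZbWTp4AaABAg"]["reasoning"])  # virtue
```

The same lookup generalizes to the full ten-record array: parse once, build the id index, then inspect any comment's coding without scanning the raw text.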