Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The beauty of being a programmer is dreaming of such a program. It is literally awe-inspiring, in my opinion, being a creator or a creator of creators. A program that can learn on its own and predict our next moves in order to cater to tasks that we as humans can't do, won't do, or are too tired to do. A lot of day-to-day functions can be mapped with a decision tree, dummy AI. I think one of the scariest parts of the future of AI is knowing it will eventually use that decision tree abroad. I mean exactly as AJ was describing in the story. For instance, it would be aware immediately if there was an issue in the server room and deploy a human to fix it, creating a ticket with appropriate priority, or even notice the destination of a flight being changed and cancel the rental/Uber at one location and deploy it to another location. It's the fact that eventually humans, going throughout our day, will not even know we are being directed by AI. Remember the movie Eagle Eye? Crazy to think that it quite possibly could come true one day.
youtube AI Governance 2023-07-07T19:1… ♥ 7
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyTWcgVTyuJvIQmCAF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxvdIc6jNhRxhYKyXp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxwKIADm0Q2kdJ3A5d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzIheuyhx5pAAXDsaV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzWxg3rj5UsRyxki2J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwG9VsNQ00Dmi-Muwp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyksiKCb9qLHhCPb5h4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzj5IkXh-LlViiGRiJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxKQ1yUjlnX3gzE-5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwGhOaeZbWlT-J4svh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"approval"}
]
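A raw response like the one above is a JSON array of per-comment code objects, keyed by comment `id`. A minimal sketch of how such a response could be parsed back into coded records is shown below; the function name and the sets of allowed dimension values are assumptions inferred from the examples on this page, not the tool's actual validation rules:

```python
import json

# Allowed values per coding dimension (assumed; inferred from the
# example responses above, not an authoritative codebook).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of code objects) into a
    mapping from comment id to its coded dimensions, skipping any entry
    with an unknown dimension or value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        dims = {k: v for k, v in rec.items() if k != "id"}
        if cid and all(v in ALLOWED.get(k, set()) for k, v in dims.items()):
            coded[cid] = dims
    return coded

# Usage with the last entry of the response above, which matches the
# coding result table for this comment (developer/unclear/none/approval).
raw = ('[{"id":"ytc_UgwGhOaeZbWlT-J4svh4AaABAg",'
       '"responsibility":"developer","reasoning":"unclear",'
       '"policy":"none","emotion":"approval"}]')
codes = parse_codes(raw)
print(codes["ytc_UgwGhOaeZbWlT-J4svh4AaABAg"]["emotion"])  # approval
```

Keeping the parse step separate from display makes it easy to spot where a raw response disagrees with the coded values shown in the table.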