Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is a slightly modified version of the Battlestar Galactica stories, which is effectively Mormonism. Realistically, we're not even close to General AI; it's likely to take at least 50 years, and almost certainly decades longer than that. In the 1970s, Dr. Christopher Evans made a load of predictions for the year 2000. None of them came true at the time, BUT they have become true now; he was 15–25 years too early. If we consider 2025 as the year the last of his predictions came true, it shows his timing was simply off. It is unlikely a General AI is possible, but personally, I wouldn't bet money on that. Processors can only get so small, so there's a real chance that we hit a brick wall and technology progression effectively stops, or at least slows to a snail's pace. Or, if stable quantum computers do become a reality, at that point there will be an unimaginable amount of positive progression, making the future completely unpredictable. I say "completely unpredictable" if quantum computers do become truly stable and reliable, but that isn't strictly true: we know there would be great progress in medical science, for example.
Source: youtube · AI Governance · 2026-02-27T13:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           unclear
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
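
Each dimension takes one label from a small categorical set. As a minimal sketch in Python, a coding result can be held in a record and sanity-checked against those sets. Note the label sets below are inferred only from the values visible in this batch; the project's actual codebook may define additional labels:

```python
from dataclasses import dataclass

# Label sets inferred from the values visible in this batch; the
# project's codebook may define more labels than appear here.
ALLOWED = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"unclear", "virtue", "consequentialist", "mixed"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"approval", "fear", "resignation"},
}

@dataclass
class CodingResult:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def invalid_dimensions(self) -> list[str]:
        """Return the names of any dimensions carrying an unknown label."""
        return [
            dim for dim, labels in ALLOWED.items()
            if getattr(self, dim) not in labels
        ]

result = CodingResult(
    responsibility="none",
    reasoning="consequentialist",
    policy="unclear",
    emotion="resignation",
)
assert result.invalid_dimensions() == []  # all labels are in the expected sets
```

Checking against a fixed label set catches coder drift, such as the model inventing a label the codebook does not define.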
Raw LLM Response
[ {"id":"ytc_Ugy_BxWIzW48C8tOHlB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_UgxlIwijgiYmoUYe0VF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwU9brjyXaQQB8chgp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_UgwGCn0Yy-d3VhSR-mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}, {"id":"ytc_Ugw7Rq7fChMg0dtZZFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_UgxFYhSLIVkY6Dlu3oh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugw_PFy4UVHHuKdFu5l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgxPylUV2bS3_0LUCg14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugws-9lw50vSnMNX15t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugw4pNHPNMzXrF4T7Wp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"} ]