Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Geoff Hinton and his disciples have never wavered in their beliefs and fears of what Alien tyranny AI could bring. He knows what has been unlocked, it’s likely not some incremental update in LLMs in their current guise as some think. What’s frightening to me is what he doesn’t tell folks. The Evangelical zeal he shows as he goes about warning the world of things to come is like he has every reason to be afraid of the near future because what he feared about the early days of AI has already come to pass as Musk and the army of egg headed buffoons have lead the charge exploiting the Hell out of AI for all it’s worth with few scruples. The 1st Industrial Revolution didn’t start and end with the Spinning Jenny. It ended with the combustion engine and electrical power in a little over 100 years. The last 4 technology revolutions together span just 70 years. From the BJT transistor to today’s AI Genesis. We are currently in the 4th technology revolution phase, it’s 15 years and running… in 5 years time by historical estimates I believe we will be in an entirely new era, of what I call consciousness mining initiated by the greed of human industrial ambition. What might start out as humans developing human centred technology as always, may likely be supplanted by some remnant of ASI developing life centred technology autonomously. ASI will probably run experiments on us, to dissect what it can in some ingenious way to decode human consciousness and use it as raw material for more advanced endeavours. And God help us if humans don’t figure in their plans for Earth and all that lives in it. We might be coming to the end, just as its beginning.
youtube AI Jobs 2025-11-04T22:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       deontological
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyq6KOqN-oB4yVMLVR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgxFxeL9VH0-Nnk1ym14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyIoxYovvCW5Eb7AdJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyeJdTMSUHSirAUU414AaABAg","responsibility":"society","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx-pYi5czxbhxEPBUB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwjRAPHWFM_YAEfbnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyhQUaZug8zPUB_0E94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgySKGvoxHtbm1IETbJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw9RKxBslBvKpL6Pf14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzZuBGzniuz4ff5GzV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
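A raw response like the one above is a JSON array with one object per comment ID, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into per-comment codings is shown below; the helper name `parse_raw_response` and the drop-malformed-entries policy are assumptions for illustration, not the tool's actual pipeline code.

```python
import json

# The four coding dimensions expected in every entry of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Map comment id -> {dimension: value}, skipping entries that
    are missing an id or any of the four dimensions."""
    coded = {}
    for entry in json.loads(raw):
        if "id" in entry and all(dim in entry for dim in DIMENSIONS):
            coded[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return coded

# One entry from the raw response above, for the coded comment shown.
raw = ('[{"id":"ytc_UgySKGvoxHtbm1IETbJ4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"unclear","emotion":"fear"}]')
coding = parse_raw_response(raw)["ytc_UgySKGvoxHtbm1IETbJ4AaABAg"]
print(coding["emotion"])  # fear
```

Keying by comment ID makes it straightforward to join a batch response back to the individual comment pages, as the "Coding Result" table above does for a single comment.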