Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
sorry to say but ''the fear of AI is ungrounded'' is probably the dumbest thing I have heard in a long time. If you say you work with AI then you probably mean your coffee machine - because if you had a real clue about IT you would know that it's pretty possible already to write algorithms where the computer learns by itself. Now think a step further if you give a computer an option to educate itself through the internet it can give you every answer you are searching for in seconds. Knowledge is power. If you write the algorithm in a way where the computer has to serve the humans, do good for humanity and it takes the knowledge of the internet in consideration it will soon question what is good and wrong. And this is a keypoint. Simple example : What if a drone decides the possibility of you hurting humanity is bigger than the possibility of you serving humanity ? Then it will kill you. So I don't think you really understand what AI is capable of.
YouTube 2021-12-18T16:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugz1b6hJ0hdS_gdTq4F4AaABAg.AHFbF0naRvIAVpguBoO_Dn","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwOAORBF8dUCx4ef414AaABAg.9MP28cxP4B_9MTd2tusi9c","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyV6zwPqQ8y6rb3qvt4AaABAg.9EUhZHpZcN-9F-9tsmT00V","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyV6zwPqQ8y6rb3qvt4AaABAg.9EUhZHpZcN-9W5uQrH75SR","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugxsmq36x96mhb7CJpp4AaABAg.943UPW9uBpU9EVZwUduCn5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz6tMO1HuQijvri-NV4AaABAg.8wJP4qeVUkG8wOw1YcEDy_","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzQoiHSIs4eldEWBpp4AaABAg.8vsvTxR-Fdx8w8RgO2bD0S","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyyYfDnFH8lVL1HuFx4AaABAg.8vbnSRjLPqU8wLN3ocb7W9","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwJD2lznMWMukhxokd4AaABAg.8vOPkOUUIJ88vt5Bb61tNV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwJD2lznMWMukhxokd4AaABAg.8vOPkOUUIJ89A5j7pp8jFs","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
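The raw response is a JSON array covering a whole batch, while the Coding Result above shows the single record matching this comment's id. A minimal sketch of how that lookup might work is below; the function name, variable names, and the truncated sample payload are illustrative, not part of the actual pipeline.

```python
import json

# Illustrative raw batch output, abbreviated to one record from the response above.
raw_response = '''
[
  {"id": "ytr_UgyV6zwPqQ8y6rb3qvt4AaABAg.9EUhZHpZcN-9W5uQrH75SR",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]
'''

def code_for_comment(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM batch response and return the record for one comment id."""
    records = json.loads(raw)
    by_id = {r["id"]: r for r in records}  # index the batch by comment id
    return by_id[comment_id]

record = code_for_comment(
    raw_response,
    "ytr_UgyV6zwPqQ8y6rb3qvt4AaABAg.9EUhZHpZcN-9W5uQrH75SR",
)
print(record["emotion"])  # outrage
```

Indexing by id rather than by position makes the lookup robust if the model returns records in a different order than the comments were submitted.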