Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Developers have been saying these for years. AI is a black box, it cant reason how they reached that code. It can give you a false explanation but if you ask it 10 times with same code. Variability of answer will be very high pointing towards non-deterministic reasoning/pseudo-reasoning. Next thing is accountability. The biggest thing that AI will never able to achieve. You just cant rely an AI to maintain whole codebase without human intervention.
youtube AI Jobs 2026-02-05T18:3…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyajkIBQnKyTbyGgt54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgznV1l2TT_l0B21g0p4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgykeLwZv7UF7Yvb-jx4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx38NgWVsx5vCcRp994AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz2AW6gDTG1kdnJb_x4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxChjBp4Ostg0053dV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwopgp1wgwPAdWkQxB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy_ikLXKWg832WVvxZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwSbUNbtBAkoafgvl54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxgwJrFOf-oOG1YRK94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
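To cross-check a coded row against the raw model output, the response can be parsed and indexed by comment id. A minimal sketch using Python's standard `json` module, with two records copied verbatim from the raw response above (no tool-specific API is assumed):

```python
import json

# Two records copied from the raw LLM response above.
raw = """[
  {"id": "ytc_Ugz2AW6gDTG1kdnJb_x4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyajkIBQnKyTbyGgt54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

records = json.loads(raw)

# Index by comment id so any coded row can be looked up directly.
by_id = {r["id"]: r for r in records}

coded = by_id["ytc_Ugz2AW6gDTG1kdnJb_x4AaABAg"]
print(coded["responsibility"], coded["reasoning"], coded["emotion"])
# → ai_itself deontological mixed
```

This lookup confirms that the dimensions shown in the Coding Result table (responsibility `ai_itself`, reasoning `deontological`, emotion `mixed`) match the record the model actually emitted for this comment.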