Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
For AI provider TOS issues and also for confidentiality purposes, I think your best bet is using a self-hosted model, such as Stable Diffusion. Unlike the mentioned DALL-E or Midjourney, you run Stable Diffusion on your own machine, completely separated.
youtube · 2023-02-17T07:5… · ♥ 2
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwjeBz3ygCH-9pEdKB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwWWu1hnkIRi-Hf7jZ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwA_J5Y-IUeExXO3EB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzjv4Wrgy2ygl30yOF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwTQbec3kssxU8aPGV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxvo7xukvjoeG0EQ4l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxR5sNJF4z56NETyFB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzOWzmfuBPHOqSH23t4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy1S6aLCXd87DlVjEh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxkWx7p0mLHFcuwmYp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
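To recover the coding for a single comment from a raw response like the one above, the JSON array can be parsed and filtered by comment id. A minimal sketch (the function name `code_for` and the truncated two-entry sample are illustrative, not part of the pipeline):

```python
import json

# Two entries excerpted from the raw LLM response above, for illustration.
raw_response = """
[
  {"id": "ytc_UgwjeBz3ygCH-9pEdKB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxkWx7p0mLHFcuwmYp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(comment_id: str, response_text: str) -> dict:
    """Return the coded dimensions for one comment id, or raise KeyError."""
    for entry in json.loads(response_text):
        if entry.get("id") == comment_id:
            return {dim: entry[dim] for dim in DIMENSIONS}
    raise KeyError(comment_id)

codes = code_for("ytc_UgxkWx7p0mLHFcuwmYp4AaABAg", raw_response)
print(codes)
# {'responsibility': 'user', 'reasoning': 'consequentialist',
#  'policy': 'industry_self', 'emotion': 'approval'}
```

This matches the Coding Result table above: the last entry in the raw response corresponds to the displayed comment.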