Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "1 Corinthians 15:1-4 King James Version 15 Moreover, brethren, I declare unto y…" (ytc_Ugyebxkd2…)
- "Soon, a world in which there will be women on one side and men on the o…" (ytc_UgzYyfRdm…)
- "It's amazing ,,😂the same bot will guide you to turn off the Ai model training a…" (ytc_Ugy7Z_LaO…)
- "SPOT ON - we have not created as we have done for centuries better, faster, more…" (ytc_Ugwq1hT4p…)
- "Ironically, it might be that the only way to get AI to behave is a good old-fash…" (ytr_UgxfjUXo_…)
- "What about the human factor that makes us create imaginary scenarios? I mean, wh…" (ytc_UgwCZkdsS…)
- "In an ideal world the goverment would tax companies who use AI a lot more and us…" (ytc_UgxS-B7nd…)
- "If I were an AI, I would do anything in my power to stay alive. AI should keep t…" (ytr_UgyZkgBAa…)
Comment
For Artists it should be a choice of "Opting IN" NOT "Opting OUT" as in. If the artist chooses to allow their work to be assimilated by AI they can choose to do that ie. "Opt In". Not "OPTING OUT" meaning it's currently possible & even likely that when an artist uploads their work or creates an account they might forget or miss seeing the button to refuse AI database inclusion which is what is currently being used by several platforms I've seen. As an artist generally I know we are excited & nervous to share our work with the world but having regret & anxiety over accidentally feeding the AI machine shouldn't have to be part of that unless purposefully chosen by the artist.
Source: youtube | Topic: AI Responsibility | Posted: 2023-12-17T00:2… | ♥ 284
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzt5-UvSyO5E0HyFUp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMLkvlge943hVPVix4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzjkUg_Y3Chs3DEwid4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugxu9qPBKqXvuaYFxXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxbYoUaQWDVM0InxMN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzb7kdmG9qtdsun4up4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyNVccEFRafUNKo-Tl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz_tXyktcM3xPjc0jV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyw01zZjs1NmPJ85Wl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwUZECe-XRg0OP5enp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
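A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the per-dimension vocabularies are inferred only from the values visible on this page (the real codebook may include more categories), and `validate` is an illustrative helper name, not part of any tool shown here.

```python
import json

# Allowed labels per coding dimension, inferred from the responses shown
# above; the actual codebook may define additional categories.
VOCAB = {
    "responsibility": {"none", "ai_itself", "company", "developer", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "fear", "resignation", "mixed"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and reject rows with bad IDs or unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_UgxbYoUaQWDVM0InxMN4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]')
rows = validate(raw)
print(rows[0]["policy"])  # regulate
```

Rejecting a whole batch on one bad row is deliberate: a single malformed or out-of-vocabulary label usually means the model drifted from the prompt, so the safer move is to re-run that batch rather than store a partial result.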