Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I find Ai too engraciating. I had to ask it to stop being too friendly. It felt … (ytc_UgzlG-YuZ…)
I think everything needs to be regulated including regulations themselves. The l… (ytc_UgyIRSrEe…)
Morgan Stanley conference? LMAO. That’s all I need to know. Don’t use OpenAI, en… (ytc_UgyZapFnN…)
I’m gonna laugh my ass off when a new way easier storage system comes out in the… (ytc_Ugw-ZcXDj…)
they steal from artists and make creativity less neccesary, how many people alre… (ytr_UgxpFnR2R…)
I dream building a dog with AI. And with me all this all the time, protecting me… (ytc_Ugy8AP0DU…)
It's like 3d tvs, added because that was hot during the early 2010s and three ye… (ytc_UgwftSwZ4…)
Okay sex toys, time to step your game up. We need full fellatio action for thes… (ytc_UgyUg9xe5…)
Comment
We as individuals already cant be certain that everyone else is conscious or has the best intentions. Here we are worried that we will give rogue AI the atom bomb, yet we have entrusted it to humans for 70 years. I cannot read the mind or intentions of the people with the launch codes. I cannot tell if they got to positions of power by deception. Whats the difference if the AI thinks a million times faster to someone who is already powerless?
youtube · AI Moral Status · 2023-08-23T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyGg80879tSinqUEGh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxaq5imjzfeg4LzHex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugww8PygUF6gH1xGBJZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy49W2J2jI-BEIc3lB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwkO75hqpFmuChVihp4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6h_ojuzSRfw1NxTF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy0twynLZjyyLbmnWJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz6U3BWhSsVninLaBZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyCXx-5OHFr_wfWGbN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweHJH9Rn7KXfji8KZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
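The lookup-by-ID view above can be sketched in a few lines: the raw LLM response is a JSON array of per-comment codings, so indexing it by the `id` field recovers the coding result shown for any selected comment. This is a minimal illustration using entries from the response above, not the tool's actual implementation.

```python
import json

# A slice of the raw LLM response shown above (batch of coded comments).
raw_response = """[
  {"id":"ytc_Ugy49W2J2jI-BEIc3lB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz6U3BWhSsVninLaBZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]"""

# Index the batch by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Looking up the featured comment reproduces its Coding Result table.
entry = codings["ytc_Ugy49W2J2jI-BEIc3lB4AaABAg"]
print(entry["responsibility"], entry["reasoning"], entry["emotion"])
# → distributed consequentialist fear
```

Keeping the raw array alongside the indexed view preserves the exact model output while still allowing constant-time lookup per comment ID.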