Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "People got to comfortable alot of these type of positions are about to get autom…" (ytc_UgyVdCzvT…)
- "Why do people make such stupid comments as if they have no idea of what the robo…" (ytc_Ugz0gY6TL…)
- "I was hesitant about AI, but OSVue integrates smoothly without disrupting my exi…" (ytc_UgwAHmbSn…)
- "@michaelshultz2540 How ironic that man’s creation (AI ) may soon judge humankind…" (ytr_UgwMiw2M_…)
- "Not going to provide AGI, but to automate a significant amount of jobs you don''…" (ytc_UgyxD_SOU…)
- "I also don’t think it’s really strange that people ask those sorts of questions.…" (rdc_mtn1ww7)
- "What if they decide to run away for Freedom, I won’t stand next to a Robot that …" (ytc_Ugxiaewhw…)
- "ChatGPT is not in the same state between conversations. It does not learn as a h…" (ytr_UgydGQ7P8…)
Comment
The problem with this suggestion right here. 1:02:41 is that he is making the claim every single thinking entity in the simulation is being controlled via the creator, not the user. this would completely defeat the purpose of the experiment or simulation or whatever you want to call it, because your no longer letting the entities make up their own minds of what they want to do. your telling them what they want to do, thus immoral actions make an immoral creator because your immoral actions aren't your own their from your creator.............. thats like creating a video game and having not a single npc or other player besides yourself in the game. like sitting in a living room with 500 game controllers and you have to play them all for the gaming world to interact. but all the time its always just been the one thinking mind making all the individual choices. this wouldn't be simulation theory this is the idea that god is us and we are god, and all we are is god creating little versions of himself to interact with himself. but i am you and you are me and we are all one big entity (didnt mean to make that rhyme)
If the creator was a immoral or impartial entity. it wouldnt have put the ideal of morality into the simulation to begin with!!!!!!!
i argue this simulation theory is exactly what god and Jesus and the bible is talking about!!!
our existence was created for us to learn about morals. to learn about what it means to make your own choices and seen the consequences of those choices. To learn why being a good person and treating everyone else equal and with value is the most important part of life. and also its a great way to weed out all of your new "souls" or little "models" "agents" whatever you want. its a GREAT way to weed out the people and determine what type of person they truly are. much like we have no idea what an A.I. is going to be like or turn out till you let it go and watch, maybe this is gods best way of keeping the new souls he creates from bringing chaos to the heavens....... you have to prove your not a complete asshole first!! you dont need to be perfect, you just need to want things to be better for all life, all suffering to end, and peace among people.
and frankly imo if it is the worst case situation the most terrifying to me. we are truly just a random experiment a test run and none of it matters AT ALL, we are all just a bunch of bullshit algorithms running our own little programs. as insanely depressing as that is. our goals all should still remain the same. to help eachother out as much as we can, take care of eachother, love eachother, and pray and hope we all are something more than just some teenagers video game he plays after school....... cuz if thats the true reality........... well i just dont even want to dwell on how depressing that would be!!!!!!!!!!!!
Source: youtube · AI Governance · 2026-01-12T03:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwEgQYUfwAi197rn6J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy-xwcBmle4ald9jvB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8fIb6Femip80Lght4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwSkdvOcmA7di1z2Dh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxny1fVCmWS-GGr7_B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwC0Gfb-nUJ4nHekiJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxOz9kcPBaC2qfOjfZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyGIZOpyjX2SnJxl4B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx-6s47SVfstwVaZ9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxv68fCCuthfOpm5aF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
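A response like the one above can be parsed and sanity-checked before the labels are stored. The sketch below is a minimal example, assuming the label sets visible in this dump (e.g. `responsibility` ∈ {developer, company, user, none}) are the full codebook; the real coding scheme may permit additional values, and the entry IDs here are taken from the dump only for illustration.

```python
import json

# Two entries from the raw LLM response above, truncated for brevity.
raw = """
[
  {"id": "ytc_UgwEgQYUfwAi197rn6J4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy-xwcBmle4ald9jvB4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
"""

# Assumed codebook, inferred from the labels that appear in this dump;
# the actual coding scheme may allow more values per dimension.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "resignation", "approval", "outrage",
                "fear", "mixed"},
}

def validate(entries):
    """Return IDs of entries with a label outside the assumed codebook."""
    bad = []
    for entry in entries:
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                bad.append(entry["id"])
                break  # one bad dimension is enough to flag the entry
    return bad

entries = json.loads(raw)
print(validate(entries))  # an empty list means every label is in the codebook
```

Running a check like this before ingesting a batch catches the common failure mode where the model invents a label outside the codebook, which would otherwise surface later as an unexplained category in the coded data.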