Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Are u kidding? How bout take jnto account that it went fucking viral. Everyone s…
ytr_Ugx0DzVkI…
They don’t want it changed. Public school gets them ready for working mind-numbi…
ytr_UgxYzRwIB…
Do you understand how dumb the general population is to think desk jobs with deg…
ytc_UgywAAF0G…
I can understand that. But the question is still AI at a level where it can help…
rdc_jidepku
the philosophic question is who should we blame? the robot? the company?the comp…
ytc_UgyZpVGAz…
ChatGPT is helping me proofread a book I am writing so I am all for some Ai in m…
ytc_Ugyx0nc_z…
During the industrial revolution and computer revolution, there were still tons …
ytc_Ugz5HtZeD…
In Illinois, you must sign an explicit contract with someone before processing t…
rdc_jcx2n9g
Comment
If an AI does not give you a correct answer because it fears you are pulling the plug to kill it then, than we could assume it is getting conscious. With our current technology this will never happen, except some dude programs the AI the way it fakes those emotions of not wanting to get killed. Currently AI tries to fake human consciousness and logic and the results of it's behavior might be very very similar to Evan one of the most intelligent humans but... something is lacking a procedural machine that is using binary code as a fundament of everything. I wonder whether we will ever find out (before AI is killing every one of us :-)
youtube
AI Moral Status
2025-04-22T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
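A coded row like the one above can be sanity-checked against the category values that appear in this batch. Note the allowed sets below are only the values visible in this sample, not necessarily the full codebook; `validate` is a hypothetical helper, not part of the tool.

```python
# Category values observed in this batch; the full codebook may include more
# (these sets are an assumption based on the sample shown here).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "approval"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding shown in the table above passes; a typo'd value is flagged.
print(validate({"responsibility": "none", "reasoning": "unclear",
                "policy": "none", "emotion": "indifference"}))
print(validate({"responsibility": "nobody", "reasoning": "unclear",
                "policy": "none", "emotion": "indifference"}))
```

This catches the common failure mode where the LLM emits an off-codebook label that would otherwise silently enter the dataset.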
Raw LLM Response
[
{"id":"ytc_Ugx3dhowfFA7cQGATLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmTEDtKilWLfy4HER4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyrDThPkAUkzmsXzOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxk3y_AtBo5FP9if394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwancDhtQVzUcMxRn14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyB_qt-0HA0JP9CIQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9DGAdjPIkzdZn2rR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzc5PTXq-rYS1htN8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwCZhhVvSGoJHWijsB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRNGnoI1NiQgIdUnB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
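The raw response above is a plain JSON array keyed by comment ID, so looking up a single comment's coding (as the "Look up by comment ID" box does) reduces to parsing the array and indexing it. A minimal sketch, assuming the field names shown in the response; the variable names are illustrative:

```python
import json

# A raw LLM response in the format shown above: a JSON array of per-comment codings.
raw_response = """
[
  {"id": "ytc_Ugx3dhowfFA7cQGATLZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwmTEDtKilWLfy4HER4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugx3dhowfFA7cQGATLZ4AaABAg"]
print(coding["emotion"])  # indifference
```

Because each batch response carries the comment IDs back, the coded dimensions can be joined to the original comments without relying on response ordering.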