Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Robots will never be conscious for fucks sake. They will only be able to make us believe that they are.
They are not organic in any way, they are just code, however you look at it. They do not experience time/life/growth/death, hell they do not EXPERIENCE anything. They learn through symbols/syntax which is completely different to our cellular brain. Our AI is still weak even though apparently strong AI has been around the corner for decades... Most importantly we have no idea what consciousness means... after all it's just a word we use to describe our subjective biological experiences.
I'm getting mad now... how can people be so stupid? THIS IS NOT HOW ANY OF THIS WORKS. MACHINES ARE MACHINES.
Platform: youtube
Video: AI Moral Status
Posted: 2017-02-24T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiVYzcqK51muHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjAwYV7bNsWrngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiM2ON3rg2HuXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjoNG0qXemj1HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggnlcXdFfJjMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggQEebRU0W_WngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjFW1C80PQDVngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgijAzfQFlDQrngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj3wix4Hs8P1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
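A response like the one above is only usable downstream if every record carries a valid value on each coding dimension. Below is a minimal Python sketch of such a validation step. It is an illustration, not the project's actual pipeline: the `ALLOWED` codebook is inferred from the values visible in this response (plus `ai` and `mixed` seen elsewhere on this page), and `parse_coding_batch` is a hypothetical helper name.

```python
import json

# Assumed codebook, inferred from the values shown in this response.
# The real coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "mixed", "approval", "outrage", "resignation"},
}

def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM JSON array into {comment_id: codes}, rejecting
    records that lack an id or use a value outside the codebook."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: invalid {dim}={value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Validating at parse time means a malformed or hallucinated label fails loudly with the offending comment ID, rather than silently entering the coded dataset.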