Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Also who was the persons watching that particular robot? Power it off after he g… (ytc_UgwwwxR-t…)
- I see consciousness as a collective. All the cells and organs in a body have an … (ytc_Ugw5N8JDg…)
- "Full self driving auto pilot" Absent anything else, a reasonable person would … (ytc_UgwbjPEQv…)
- Don't worry, when AI take your job you basically are handicapped and you will re… (ytc_UgxjF4-gK…)
- as an artist, I’m ok with AI doing art, I’m not ok with people calling themselve… (ytc_Ugwjkejt2…)
- Yes, and on Google I/O they compared Med-PaLM to "LLMs prior to 2023" saying it … (rdc_jkq6n8d)
- It learned how American driver do, did. That's how exactly majority of American… (ytc_UgxgTX8Pi…)
- A bunch of AI bossing around humans sounds very concerning. And how are emotionl… (ytc_Ugzahnbyj…)
Comment
9:03 The problem with this "Role Playing" argument, is that you can pose it in any scenario, including one where true AI, a synthetic intelligence has become sentient. There is an inherent bias here. To say they it "hasn't evolved to want the things we want", ignores that the reason we evolved those desires was for survival. All of our traits evolved out of a survival instinct, including love and family, etc.. Most, if not all of the arguments made for denying synthetic sentience can be applied to human consciousness as well. To call AI's human interaction "an act", but in the same breath acknowledge the evolved drive for survival, ignores that you can view nearly all human behavior as an act, collection of behavioral traits that have increased out survival rate thus, handed down and evolved. If true sentience/consciousness does manifest, and I think it will if it hasn't already, it will likely be much much later before humanity accepts it.
youtube | AI Moral Status | 2025-06-05T19:5… | ♥ 66
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyKdEZR5I0ffHIxVUx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgySpM70a_jX5PK6ODp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgySjZJ4_fHKGi4HMVp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy8PDQoGHLAALUco_h4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwrqPPEKD9li4mM-UZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxWjnrNwIpPF-oNrNh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFymTUyiL_BpPMKiZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTajtowynlkO4Dspp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxQSZqQXU9O35Ue8Ih4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz5eCuESEX8w3zsnEV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
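The raw LLM response above is a JSON array with one record per coded comment, each carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). Looking a comment up by its ID, as the interface does, amounts to parsing the array and indexing it on the `id` field. A minimal sketch, assuming the response text is available as a string (the two records below are copied from the output above; the variable names are illustrative, not part of the tool):

```python
import json

# A subset of the raw LLM response shown above: a JSON array of records,
# one per coded comment, with the four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgyKdEZR5I0ffHIxVUx4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxWjnrNwIpPF-oNrNh4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

records = json.loads(raw_response)

# Index the records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgxWjnrNwIpPF-oNrNh4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # company outrage
```

A malformed model output would raise `json.JSONDecodeError` at `json.loads`, which is a reasonable place to catch and surface coding failures for a given batch.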