Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It's interesting to see what chat GBT can't attempt to answer as well. I was hig…
ytc_Ugzk6PPrO…
For everyone saying driverless cars is a stupid idea, I've read the Air Force ha…
ytc_UgwN45SLy…
Human traits have always been undesirable, though. That's why we invented the co…
ytc_Ugzb9ScCw…
Hi, disabled artist here, thank you so much for bringing up how shitty it is to …
ytc_Ugys93Kwy…
ART IS DEAD, SO IS DESIGN. At the end of the day, the only thing that matters to…
ytc_UgwMf1JkB…
Wild to think some people are so dense they can’t tell the difference between di…
ytc_UgypcDtZC…
It's already been determined that facial recognition tech doesn't work on people…
ytc_UgxT6nfmi…
@totitelevisionshow while I get what you mean I do think your confused what you…
ytr_Ugyo6hc7s…
Comment
This is a hard one for me because these are two of the most influential “superhero heroes” I look up to in my life right now. However, they are pitching something incorrectly: that these models could be gaining consciousness sometime soon. We don’t know; we just do not know whether they can be conscious or not, and saying “I don’t know” is the right answer here in general.
The second thing is that the Turing test is not something that has been passed, in my eyes. Genuinely ask yourself: what is the Turing test? What am I testing for? Then do that with the model and ask yourself: is it actually passing this test right now? I have seen Claude do that for me on multiple occasions, but it still has robotic features baked into its design that make it feel extremely robotic, and it is not Turing-test worthy.
youtube
AI Governance
2026-03-11T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugxwp7VugAwzIR2XeMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzywSBDetSAodsCwRl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8DQEiyL27Sdde_kF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxj3ndGIgaZWmdsZal4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw8q8zeNqWMxFSZX8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHkkXiXpN123sNKqp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyqeTPYZCpIQjsRPxh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxLxJtksDafJuNcLol4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLugHlzM-UyPrzwxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwU_AcGOkVlpXHsr014AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}]
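A raw response like the one above is a JSON array of per-comment codes across the four dimensions shown in the results table. Below is a minimal sketch of how such a batch could be parsed and validated before it reaches the results view. The function name `parse_raw_response` and the validation rules are illustrative assumptions, not the project's actual pipeline, and the sample row is hypothetical.

```python
import json

# The four coding dimensions seen in the sample output above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response into per-comment rows.

    Fails loudly on malformed JSON or missing fields, so a bad batch
    is caught instead of silently producing empty or "unclear" rows.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"model output is not valid JSON: {exc}") from exc
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for row in rows:
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"row {row.get('id', '?')} is missing {missing}")
    return rows

# Hypothetical one-row response in the same shape as the dump above.
raw = ('[{"id":"ytc_demo","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
coded = parse_raw_response(raw)
```

Validating eagerly at ingest keeps the lookup-by-ID view honest: a coded comment is either fully present or the whole batch is flagged for re-coding.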