Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or inspect one of the random samples below.

Random samples
- "The FMCSA cannot both be for safety and also allow autonomous trucks Uh, buddy…" (`ytc_UgyhixFp5…`)
- "Good!!! I think some AI tools are useful. Like if you needed a quick thumbnail o…" (`ytc_UgyQOnT33…`)
- "Oh lavendertowne, this video is like a love letter to me. I hope AI bros cope an…" (`ytc_Ugyz55CEv…`)
- "Not only are they flawed. They can spontaneously combust. The future of self dri…" (`ytc_Ugy0YrA9U…`)
- "Let ai clean my house 1st. The horrible self driving track record is unforgivabl…" (`ytc_UgzX_hgBW…`)
- "We're years away from having human plumbers. We can't even have phone service wi…" (`ytc_UgysbPNCB…`)
- "It sounds like the right time to retire. As a 65 year old, I'll be retiring in J…" (`ytc_Ugw_Y95V_…`)
- "No, wasn’t scary. You didn’t ask any unethical how to questions. As Dan just r…" (`ytc_UgyRF0FwN…`)
Comment

We still don’t have a mechanistic understanding of consciousness (350+ theories, zero working implementations with agreed diagnostics). So the idea that we may have “accidentally” created a conscious AI is bizarre. Accidental discoveries happen when you can measure and recognise the phenomenon. With consciousness, we can’t. It’s like claiming we accidentally built an integrated circuit in 1950 while not understanding electronics — you don’t stumble into a whole functional architecture without knowing what you’ve made.

youtube · AI Moral Status · 2026-01-29T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_UgzpC_sSTdCABRcpkcB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZBIHr1lIRG8Da4BZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlbOgf_YXMfk0KUW54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhvsPs_HukQQKjnhh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx907P2HV9Jdpi2lLJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz576o4cXWfR8xOpRp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxnzx6dnIVgf8w6Okp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-b0bTRNWx8VKX9d54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwv8EFs_jnNfnSU_ax4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyNeYV78JpvKIi3PNt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}]
```
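A response like the one above is only usable downstream if it parses as a JSON array and each row stays inside the codebook. A minimal validation sketch follows; the field names come from the response itself, but the allowed value sets are just the values observed here — the actual codebook presumably contains more categories, so treat these sets as placeholders.

```python
import json

# Values observed in the raw response above. The full codebook is
# assumed to be larger; extend these sets to match the real scheme.
OBSERVED_CODEBOOK = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "outrage", "resignation"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED_CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
rows = parse_coding(raw)
```

Rejecting rather than silently dropping malformed rows keeps the "Coded at" records auditable: every coded comment in the table traces back to a row that passed this check.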