Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxPGpP4R… — "Thank you so much for this - it really increased my confidence in using ChatGPT …"
- ytc_UgykM0hwL… — "I hate Ai so much. Its out of control. That Ai video is clearly Taylor made pro…"
- rdc_f9e6s7a — "Sure. Things are so great in Cuba that millions have fled the country since the …"
- ytc_UgwdTZ7md… — "You're looking at around 300 hundred million jobs lost by 2030 to Ai , Robotics …"
- ytc_Ugz-gYyLK… — "There won’t be any jobs. Even farms will use ai robots to pick fruit and veges…"
- ytc_Ugz93n3ST… — "Creating, fantasy, escapism, limerence, and the chemical attachment. The chat bo…"
- ytr_UgwVsS4Og… — "I don't trust AI to make a final product. I trust it to give me the scaffolding …"
- ytc_UgwWu34go… — "Question, if I google an artist to get reference images for my personal studies,…"
Comment
Here's what vexes me and I'm sure I misunderstand him. Penrose seems to say that what we call AI can't be conscious because what AI is doing is obviously computable, else it wouldn't work at all; it's running on computers, therefore it must be computable. Somewhere in there, I get the impression that his point that "They (i.e., AI) don't understand what they're doing and therefore can't be conscious and they're not even intelligent," comes from the idea that consciousness or intelligence can't come from a computation--it's a non-computable problem of the sort described by Goedel's theorems. I don't know how he knows this. I don't think we know where consciousness comes from or whether there's only one way to manifest it or many ways, or whether ultimately our own minds arise from algorithms our brains are running, or...what?
Source: youtube · Video: AI Moral Status · Posted: 2025-05-16T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzthwna2IS3FD6X-J54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw5GTAGzRVPZ0H5aN54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrptvKqDVlLwN7NWR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzY5ABc24FjIFoTM3x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBad-NGvWgds83o7Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxFE72xe26qnFK_jx14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzdn5ItP0YgfcN8hXl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVXS69ZTyif65Q2Xt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwm0WtLGRfFC7Gfv2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqlW7uU8ExXeYtV5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
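The raw response is a JSON array with one object per comment in the batch, keyed by comment ID. A minimal sketch of how such a response could be parsed and validated before storing coding results — note the allowed label sets below are inferred from the values visible in this output, not from a published coding scheme, so they are an assumption:

```python
import json

# Label vocabularies inferred from values observed in raw responses;
# the actual coding scheme may permit additional values (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: labels},
    raising on missing fields or out-of-vocabulary labels."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        labels = {}
        for dim, allowed in ALLOWED.items():
            value = rec[dim]  # KeyError if the model dropped a dimension
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
            labels[dim] = value
        coded[cid] = labels
    return coded

raw = '[{"id":"ytc_X","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]'
print(parse_coding_response(raw)["ytc_X"]["emotion"])  # indifference
```

Validating eagerly like this surfaces malformed model output at coding time, rather than when a dashboard later tries to render an unknown label.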