Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI isn't going to become conscious anytime soon because that's not how LLMs (which have adopted the title of AI despite not fitting the classic definition) work. They can't "learn" anything. They can't form opinions or new ideas. All they can do is regurgitate words from their training data in the most likely order. Which is good for sounding like a human, but not for much else.
I'm no scientist, but the way I see it, AI isn't going to be _real_ AI until we can make artificial neurons at a quantity near enough to us. Nature nailed this; cells which can form connections to eachother to create more and more complex ideas.
Platform: youtube · Video: AI Moral Status · Posted: 2023-07-12T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy5VPTTYZYHTJ3TVlF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzBkRuYyFfgSU_RHFJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwnKpbcQoZGE4bNrCp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwVonILuww8ji8YONZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxruvYm2hmT_jeNAwZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyoinQEGNhjBOndaLV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx_L3uTVDRQwev7bG54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzVSa1Eaj99x8SziKB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzUCmL1Z7sfmFJMtMR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxOmxuqOg7CM98Tv3p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
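The raw response is a JSON array with one object per coded comment, so inspecting the codes for a single comment amounts to parsing the array and indexing by `id`. A minimal Python sketch (the variable names here are illustrative, not part of the tool; the payload is the batch shown above, copied verbatim):

```python
import json

# The raw batch response from the model, exactly as stored for this batch.
raw = (
    '[{"id":"ytc_Ugy5VPTTYZYHTJ3TVlF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},'
    '{"id":"ytc_UgzBkRuYyFfgSU_RHFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},'
    '{"id":"ytc_UgwnKpbcQoZGE4bNrCp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},'
    '{"id":"ytc_UgwVonILuww8ji8YONZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},'
    '{"id":"ytc_UgxruvYm2hmT_jeNAwZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_UgyoinQEGNhjBOndaLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},'
    '{"id":"ytc_Ugx_L3uTVDRQwev7bG54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_UgzVSa1Eaj99x8SziKB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_UgzUCmL1Z7sfmFJMtMR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},'
    '{"id":"ytc_UgxOmxuqOg7CM98Tv3p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
)

# Parse the batch and index each record by its comment ID.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# A lookup returns the four coded dimensions for that comment.
print(len(by_id))                                            # 10 comments in this batch
print(by_id["ytc_UgzUCmL1Z7sfmFJMtMR4AaABAg"]["emotion"])    # fear
print(by_id["ytc_UgzUCmL1Z7sfmFJMtMR4AaABAg"]["policy"])     # liability
```

Because every object carries the same five keys (`id` plus the four dimensions), the same index also supports simple aggregate checks, e.g. counting how often each `responsibility` value was assigned across the batch.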