Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by browsing the random samples below.
- `rdc_emok7ks`: "I think it's great you are doing this, but the realist in me says this is far to…"
- `ytc_UgwYctdX8…`: "Evolution of autonomous AI... that's beyond dangerous. We are asking to be …"
- `ytc_UgzQAE9df…`: "It's still at the bottom / beginning of the S-Curve. When the F-16 Falcon / Vip…"
- `ytr_UgzPS3b_k…`: "Yea. But you can poison it. There's a program called Nightshade, when you put yo…"
- `rdc_grr9emb`: "Well, China has the equivalent landmass of the entire Europe, and the combined p…"
- `ytc_UgxhhLhya…`: "Sorry, but AI won’t clean a sewer line or take trees down or even plant trees. T…"
- `ytc_UgxsAgLp8…`: "No worries, CCP is already working on a DNA targeted virus, all that's left is A…"
- `ytc_UgzctxW1q…`: "Leaving aside the ethics of the generative AI existence, it's so infuriating wh…"
Comment
It is amazingly stupid of people, who would rather make podcast, about an NLP model, which is trained upon gazillions worth of data, which basically enables the model to almost have answers to any problems presented. Then to actually read and understand about how the ML models work 😂. And to answer your question, no the GPT model is not conscious, it’s simple calculus working under the hood, and that algorithm has access to huge amount of data online. For starters you can read, “Grokking deep learning”.
This podcast makes you look stupid in front of people who actually develop and have decent amount of knowledge about the ML algorithms.
Platform: youtube
Topic: AI Moral Status
Posted: 2024-08-29T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_UgwbjnC75pETunLXCCN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzZ2x6WGALzxgZndm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx1DjpEnHZPl8fy_V14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"unclear"},
 {"id":"ytc_UgxPjaE1GLeEQwHcuqp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzA4gs5xdvF-lmqW1R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyxlIDMxUjPV7AO9gR4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"unclear"},
 {"id":"ytc_UgzYrkiEpFN8w6WMa0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwEHxQA4_or6zYYtcF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyPN1QXZMtzIlsvY8N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugwy7yvngqMTJfs3hQh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"unclear"}]
```