Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- This is a serious matter. People’s lives and safety are in danger. There are eve… (ytc_UgwlAqxc7…)
- People who are scared of AI taking over programming have never actually studied … (ytc_UgwsQkq-l…)
- Until you get thrown in jail because chatgpt tells you to follow another State's… (rdc_jhcjk34)
- This AI is Y2K all over again. AI is so smart that I can just flip this switch o… (ytc_UgzlIjG9x…)
- People who actually care about art and artists will always support actual artist… (ytc_UgxobWax2…)
- Lol. So dangerous they’ll steal all of our bananas. Lmao. And for AI and robots.… (ytc_Ugyc-Yixx…)
- @happyshark405 '"ART" nowadays is throwing paint on canvas and seeing how the pa… (ytr_Ugwmhc83W…)
- In this scenario, both the customers and the products will simply change—most pe… (ytr_UgyDXdpTJ…)
Comment (youtube, 2025-09-05T23:0…)

Relax. There is zero evidence that Artificial General Intelligence (which is a step on the way to these hypothetical artificial superintelligences) will be achieved anytime soon or in our lifetime, or even that the current advances with large language models etc. are even a step in that direction. They look impressive from the outside, but they may be complete technological dead ends with regards to AGI. These predictions are utter nonsense.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxI8v3mvweRKfCyzpl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwemgYUWOIjAxROOcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw5CL4kFI7NTureCM14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwOo0EIn-8UMSIW9BB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwkzIbPpkXO8cijaVV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYK4t6Bj3IwOR-58Z4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwdzSe5Mj7R5_2Zs1V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYa9Mzf4XlEBFXINJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxZnU19qSo_0MACY7h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvztazTRPotoa2pOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
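The raw response is a JSON array with one record per coded comment, carrying the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). Before trusting a batch like this, it helps to validate it programmatically. Below is a minimal sketch in Python; the `ALLOWED` sets are inferred only from the values visible on this page, not from the actual codebook, and `validate_batch` is a hypothetical helper, not part of any pipeline described here.

```python
import json

# Allowed values inferred from the codes visible on this page;
# the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "ai_itself", "government"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "indifference", "fear", "mixed", "approval", "resignation"},
}

def validate_batch(raw: str) -> list:
    """Return a list of problems found in a raw LLM batch response.

    An empty list means every record parsed and every dimension held
    a value from the (assumed) codebook.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return ["invalid JSON: %s" % exc]
    if not isinstance(records, list):
        return ["top-level value is not a JSON array"]

    problems = []
    for i, rec in enumerate(records):
        if not isinstance(rec, dict) or "id" not in rec:
            problems.append("record %d: missing 'id'" % i)
            continue
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append("record %d (%s): %s=%r not in codebook"
                                % (i, rec["id"], dim, value))
    return problems
```

A batch that parses cleanly and uses only known values yields an empty problem list; a malformed record (say, a misspelled emotion label) produces one entry per violation, which can be logged alongside the comment ID for re-coding.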