Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- This NYU woman had me, until she claimed journalism is suffering and we won’t ha… (ytc_UgzrTCk8L…)
- @fallingidiot the whole point of art is trying and learning. That's something … (ytr_Ugxk-k1u5…)
- The assumption that superintelligence will destroy humanity is absurd. It is lik… (ytc_UgycDru-k…)
- Why does everything have to be so dramatic? AI isn’t some evil villain waiting t… (ytc_Ugxemz0CT…)
- The last robot is me looking at my mom and dad fighting💀 But robot team cause my… (ytc_UgykEj0qp…)
- So @CleoAbram… what if someone “trained” an AI on your image, voice and content … (ytc_Ugw8OY_c6…)
- @midnightsan9917 because its the only logic step once a society achieves a post … (ytr_Ugy8HM3bp…)
- Robot rights would be necessary if they can feel, but why would we make robots t… (ytc_Uggz-8DSC…)
Comment
"AGI in 2026" is major BS.. we have no idea how to make machine act like that and machine learning doesnt even come close to that in terms of internal working. Roman just said that CEO's of these AI companies (who make money purely on stocks and not product) told him.. yeah, what else should they do? They need funding. If you are interested in no marketing opinion about AI and AGI check Tomáš Mikolov.
Source: youtube · Posted: 2024-08-10T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzHmB5ZT-MSE3pZJ_14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwMAIdlfZLfGpOQq8N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPPTWfOiUN0p5wCqZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6ouZc0QQ6fwQ5Vth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9_CQvUBOzx8j3H5V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx742ZzUDqfcZhYe1t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzvSXAMfQlFQXdiW0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzxs2_0HBC8o4DjLhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw_QELV0Q3MwUTRKKZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzwRhUkQg24RRGFQOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]
```
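A raw response like the one above is a JSON array of per-comment codings, one object per comment ID, with one value for each of the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated, assuming the allowed category values are exactly those visible in this page (the real codebook may define more; `parse_coding_response` is a hypothetical helper, not part of the tool):

```python
import json

# Category values inferred from the data shown above; the actual
# codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "mixed"},
    "policy": {"none", "unclear", "ban", "industry_self"},
    "emotion": {"outrage", "indifference", "fear", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting out-of-codebook values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzHmB5ZT-MSE3pZJ_14AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgzHmB5ZT-MSE3pZJ_14AaABAg"]["emotion"])  # outrage
```

Keying by comment ID is what makes the "look up by comment ID" view above possible: any coded comment's dimensions can then be fetched in one dictionary access.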