Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples

- `ytr_UgzPWCu1w…`: "@TrumpSupporter2024 So what you are saying is that artists shouldn't be compens…"
- `ytc_UgyoWbedd…`: "I feel like if I reply to this post, then AI will just use it against me. So ha…"
- `ytc_UgwBxec4n…`: "Human beings are weird. AI is weirder. It's not a bad thing to be human, it's an…"
- `ytc_Ugwmj-8tu…`: "If our government ends up bailing out these tech companies for trillions of doll…"
- `ytc_UgzemvYNK…`: "I agree with that, even no popular artists might notice their artworks in the AI…"
- `ytr_UgxgfLdx4…`: "I would say that the marginal productivity of AI goes up with some humans in the…"
- `ytr_UgychT8hm…`: "Yeah, this is the way to do it, also works with generating images. Examples: par…"
- `ytc_UgxLn_H9O…`: "this story about claude checking a mail server for information about an affair a…"
Comment

> Commercial AIs are old. They are not even the cutting models that they have internally. They are far ahead of where you think they are, they just have not been releasing them. Why? Because humans. When what they have has the core foundation to eliminate 66% of all US jobs over years to a decade, would you release it all at once or test it in low quality grades with better progress, but bugs. This is strategy. It is not LLMs. LLMs have a couple years to go before modified LLM world models are scaled. World models are still based mostly on LLMs and or diffusion models.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Responsibility |
| Posted | 2026-02-13T21:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxO-l2cv8SdBwVFs7B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2wGdkUgVMStQ92714AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwjfbPZvQ-qZ2daGO14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXzIsvYd_TQiePhwB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyE_0h4RhSlVlDLeg94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxVsfmlVD_ZFHGqral4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxP9Te7AIzikjhQfGV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyws6KLpJoeijLJj5J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyALsN9atZzB6sNXOV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzFLzCp97xutgjrhex4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
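The raw response is a JSON array with one object per coded comment: an `id` plus one value for each coding dimension. A minimal sketch of parsing such a response and indexing it by comment ID, assuming the array shape shown here (the allowed dimension values below are only the ones observed in this sample, not necessarily the full codebook):

```python
import json

# A raw LLM response in the format shown above (two entries copied
# from the sample for illustration).
raw_response = """
[
  {"id": "ytc_UgxO-l2cv8SdBwVFs7B4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzFLzCp97xutgjrhex4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
"""

# Dimension values observed in this sample (assumption: the real
# codebook may allow more).
OBSERVED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw response and index codes by comment ID,
    skipping entries with an unrecognized dimension value."""
    by_id = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        codes = {k: v for k, v in entry.items() if k != "id"}
        if cid and all(v in OBSERVED.get(k, set()) for k, v in codes.items()):
            by_id[cid] = codes
    return by_id

codes = index_codes(raw_response)
print(codes["ytc_UgzFLzCp97xutgjrhex4AaABAg"]["emotion"])  # mixed
```

This mirrors the lookup-by-ID workflow above: once indexed, any coded comment's dimensions can be retrieved directly from its ID, and malformed model output is filtered rather than silently stored.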