Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
As an AI story teller named Marcus Leo Chen, I co-sign these comments & sentimen…
ytc_Ugw1bspfs…
You all can kiss your jobs goodbye. 😘 AI is a godsend for these greedy corporate…
ytc_Ugyzum-FA…
funny thing how the people who can't make an AI are the people who have these co…
ytc_UgxzCT67C…
I think if she would had addressed this sooner as AI generated then she probably…
ytc_UgzJFgfw0…
I've been a fan of your work for a while. I worry im putting some sort of target…
ytc_Ugy6ktn4M…
I envision a future with ultra competitive job markets because of more jobs beco…
ytc_UgxAmkC3g…
Real artists just create because they enjoy doing it. It's almost a compulsion.…
ytc_UgwF6W6NL…
just make it illegal to be used in professional settings, I have nothing against…
ytr_UgxYcQAmV…
Comment
In my experience talking with engineers working in AI, the overwhelming majority understand that the LLMs presently in development are - at best - a small part of what AGI will be, like the frontal lobe in the human brain: specialized for a specific task but insufficient alone for intelligence. On the other hand, the most pessimistic see LLMs as a dead end, fundamentally incompatible with AGI. Especially if the latter turns out to be the case, a radically different approach, one we've yet to identify, will be necessary. In either case, though, it's not clear AGI is any different from physics' promise of fusion.
youtube
AI Moral Status
2025-10-30T18:5…
♥ 268
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxFRZ3ULDDHYaVFVa54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwpBMyuyzK-7qhVLfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0-5WwJf846xl2_8R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKsvre-Pndgqw1PnN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzOuqD7kyxc9-ouC594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXzt7wWl9tcxfpmVh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdhM7CJ9AOS3XZOYt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzF4EYAm-1EQ2_o6pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw9yVxH60zBVEnhqIZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxxy_KXEeL86_ndwU94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
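The raw response above is a JSON array, one object per comment, keyed by comment ID with one label per coding dimension. A minimal sketch of how such output could be parsed, validated, and indexed for lookup by ID (the allowed label sets below are inferred only from the examples shown on this page; the full codebook may include more values, and the sample ID is hypothetical):

```python
import json

# Allowed labels per dimension, inferred from the examples above
# (assumption: the real codebook may define additional labels).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM response and index the valid rows by comment ID."""
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: invalid {dim!r} value {row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id

# Look up a single comment's codes by ID (hypothetical shortened ID):
raw = ('[{"id":"ytc_Ugw1","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"indifference"}]')
codes = validate_coding(raw)
print(codes["ytc_Ugw1"]["emotion"])  # indifference
```

Validating against a closed vocabulary before storing results catches the common failure mode where the model invents an off-codebook label.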