Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I've gotten fatigued with AI at this point. I think I rode the entire hype cycle over the last year. I was first fascinated by it, then became just meh about it, then disliked it, now don't even use it. I feel like both its utility and its threat are all hype and discourse and it all ends up as a net neutral that makes the technology quickly obsolete in my mind. I'm not sure if it will find a place, feels like it could be like block chain or virtual reality, a technological innovation could exist but if the wide populous has no interest or dislikes it, it will go nowhere.
Source: youtube · 2025-09-26T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwOaYclmw6hzZ9aic54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwImB_UQIWaT7aUieN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz5XKlvRBxYvjBbOKl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf83Zqk0RZXi6jazR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbDaqE3pquBYHMa7B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxP7z-Nvn2yWQJ0lGN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxOkCaNsg8y2fhyyx94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwIuOVCDESs38qQz2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxQmh8pi6kudxygF-Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxdw08m8QRVwcMB8dB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
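The raw response is a JSON array with one object per coded comment, keyed by `id`. A minimal sketch of how a lookup by comment ID could work against this format (the `raw` string and variable names here are illustrative, not the tool's actual implementation):

```python
import json

# Illustrative excerpt in the same shape as the raw LLM response above.
raw = """[
  {"id": "ytc_Ugyf83Zqk0RZXi6jazR4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]"""

# Index the array by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_Ugyf83Zqk0RZXi6jazR4AaABAg"]
print(row["emotion"])  # -> resignation
```

Indexing into a dict this way mirrors the "look up by comment ID" flow: parse once, then resolve each inspected comment to its coded dimensions directly.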