Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment (youtube, 2025-07-21T15:1…)

> Yeah that's the big problem with AI for me - I like having the option to use it when I need it, but a lot of AI tools keep trying to be helpful even when I'd rather do things myself. I think it's going to be particularly an issue when it comes to learning things, it'll be much easier to become OK at something, but much harder to become good at something because AI will keep replacing learning opportunities with autofilled solutions. Ironically, it's even a problem in AI use - image generators now have prompt generators attached to them, so you don't even have the opportunity to learn how to consistently prompt what you want to see because the AI prompts for you.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzJeM4vf-99-DTSKhJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkL3E8WrwmED7eMKF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzKGAPlm4aVPNyVnih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxm9wDXvgVnmubN2FJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHen8fyC9aOdYya2R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxMw5nDgqM3iSX7Hn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBqP4mjrTUaC_cABV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxjJxPzURp5SDrbVxh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw8e-1YnFPvZudnGup4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxLMdNcQ3XtpBnQvP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```