Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This episode really resonated. I see many benefits in AI and even use some AI to…" (ytc_Ugzn2wT-q…)
- "@JeremyB8419 You need talent and effort to create a drawing with a pencil, for a…" (ytr_UgwQzf0bM…)
- "Robots don’t get tired robots don’t get attitudes robots don’t have marital prob…" (ytc_UgzIlfnGn…)
- "It’s crazy that they would fire people first and then see if AI was going to wor…" (ytc_UgzmM3Qgg…)
- "16:50 AI generating Llamas in bikinis next to a pool: ❌ Drawing your friends as …" (ytc_UgzJLcPqg…)
- "Another reason to not allow self-driving cars without a driver present to interv…" (ytc_UgzrfoLHr…)
- "For me the ai starts to forget earlier plot lines when im charting too much :( v…" (ytc_UgxUDIG3O…)
- "As a law student I had recently presented on this on a research paper even befor…" (ytc_Ugx9ofqm_…)
Comment
Calls for AI control, especially ones that are detrimental to open source development are first and foremost facilitatory to regulatory capture. Even if we listen to these calls, there will be bad organizations who can dodge them, either because they have the money, or because they're governments.
If bad AI can be developed, it will. Attempting to prevent it by excluding the public only has downsides without any benefit.
Platform: youtube
Posted: 2025-11-12T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxAMvj6m2t9zl4Dhux4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyhiPV36QvZTm1rf0l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx6WQdr0Z2NAZ5YKct4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxlbDFp5NKOd1urW0l4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy5CLyLxLtCo-tUKvV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgxX4kbvAEHwypMzPBp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyHuJq7W3a3tbditH54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBj-jLmoFf4p1S0Nl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzVXGS34OaNLYxu8954AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzIvsG78bA3dS5Hfdp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
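The raw response is a JSON array of per-comment codes, one object per comment ID, covering the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload can be parsed, indexed for lookup by comment ID, and rendered back into the Dimension/Value table; the helper names (`index_by_id`, `to_markdown_table`) are hypothetical, not part of the tool:

```python
import json

# One record in the same shape as the batch output above.
RAW = '''[
  {"id": "ytc_Ugy5CLyLxLtCo-tUKvV4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "fear"}
]'''

# The four coding dimensions, in table order.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse the model's JSON array and index records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(payload)}

def to_markdown_table(rec: dict) -> str:
    """Render one coded record as a Dimension/Value markdown table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {dim.capitalize()} | {rec[dim]} |" for dim in DIMENSIONS]
    return "\n".join(rows)

coded = index_by_id(RAW)
print(to_markdown_table(coded["ytc_Ugy5CLyLxLtCo-tUKvV4AaABAg"]))
```

Indexing by `id` is what makes the "Look up by comment ID" view above a constant-time dictionary lookup rather than a scan over the batch.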