Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Its not even new ideas... ive seen ppl in some real ai tool forums (not just www…" (ytr_UgzdefEmP…)
- "This right here seems more correct. My father owns a small oil company (think re…" (rdc_czlk399)
- "I like AI art, I think it's neat, but you need no talent to generate cool images…" (ytc_UgxbQKGK0…)
- "Shade is hard for flowers, but ferns are great for shady areas. They don't add …" (rdc_eh5ffre)
- "This is the scariest and most real threat from AI, imo. Not the singularity, not…" (ytc_UgzVZU0S9…)
- "3 white collar Silicon Valley boys who don’t give a shit about the people involv…" (ytc_Ugzx7UQ5_…)
- "People (including ones who don't really understand the code they're asking for) …" (ytr_UgyEggkvP…)
- "I think that person did it on purpose to unite different artists together this A…" (ytc_UgzJ0HnBC…)
Comment
It takes less than half a brain to drive a car. We have been hearing about driverless cars for a decade now and they are still no where near ready to rolled out under any road conditions.
Something as basic as driving can’t be learnt by a set of code you really expect AGI or a chat bot to replace knowledge workers or end the world? It’s total gaslighting.
It’s a way to get more funding, it’s a way to justify the ridiculous spend.
Even the latest chat bot can’t answer some simple questions properly. AGI is a very very far!
Source: youtube · Posted: 2026-02-15T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwyRvPEw2SbPojttx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzyh_4iEd7TydqUL6l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuHc8LJqCDEDeDNK54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwhJF-XhR6ojpwHTL14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxelKkSpX1xw5Cdlmt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy_WBWTRJeLl3KxLP54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgAKJtZJwYHVjKVTl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5WLouwDAiAj8789F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzIl4_KmSn6WdGV5A14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJG3RwvyuvHytdxNF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
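The lookup-by-comment-ID view above can be reproduced from a raw batch response directly: the model returns a JSON array of coding rows, and indexing those rows by their `id` field gives constant-time access to any comment's coded dimensions. A minimal sketch, assuming the response parses as an array in the format shown (the sample IDs and helper name here are hypothetical, not from the tool):

```python
import json

# A batch response in the format shown above (hypothetical sample IDs).
raw_response = """
[
  {"id": "ytc_AAA", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"},
  {"id": "rdc_BBB", "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw batch response and index each coding row by its comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
print(codings["ytc_AAA"]["emotion"])  # -> outrage
```

In a real pipeline this parse step is also where malformed model output surfaces, so wrapping `json.loads` in a try/except and logging rejects would be a natural extension.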