Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@lolandie the level of of hypocrisy you’re showing is astounding. AI engineers c…
ytr_UgwJImAiM…
That's an intriguing question! The relationship between concepts like reincarnat…
ytr_Ugx8EjnS0…
Elon needs to do a better job negotiating the terms of his involvement in AI fir…
ytc_UgwJsFYsK…
AI is making solo creators unstoppable my video automation setup + monetization…
ytc_UgwUBLKfR…
Automating art is an unfortunate inevitability of creating a system that can do …
rdc_jj5nnfz
programmed todo it wont think. answer is correct or wrong. he not know. result n…
ytc_UgzL9sLAt…
One problem with Dean's hope that AI companies wouldn't release a dangerous supe…
ytc_UgyLt72yz…
Make no mistake A.I. isn't intelligent, it's just a soulless, data harvesting pi…
ytc_UgzjUqvfM…
Comment
Great analysis and I completely agree. I've been coding for a couple of decades at this point and though AI has definitely helped with some of the more routine and monotonous tasks associated with programming, it doesn't really solve specific problems. At least not in my experience. And it only does that correctly about 80% of the time (again, in my experience). Programmers aren't going anywhere anytime soon.
youtube
AI Jobs
2025-03-10T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxud4LqcRrweQhTGEJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4WKaEYC4BRSFtAt94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw1eZvFyhad2orxFDh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRChARNE5Ewn0Lg_Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXNQsjbsORlDwLghF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz07B8twbpWWcg9SoN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzUrBmxa2jgOY9jg554AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwKsmRwKOGNll29n8V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbLlIry3bEXi3UO9F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyn_0Ezmp66TDp0OXB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
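The raw response above is a plain JSON array of coded records, one per comment ID, so it can be validated before being ingested into the coding table. Below is a minimal sketch of such a check. The allowed category values are an assumption inferred only from the values observed in this sample (the real codebook may include categories not shown here), and `parse_coding_response` is a hypothetical helper name, not part of the tool.

```python
import json

# Assumption: category values inferred from the sample response above;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and validate every record."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        # Every record needs an id plus all four coding dimensions.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records
```

Validated records can then be keyed by comment ID for the look-up view, e.g. `{rec["id"]: rec for rec in parse_coding_response(raw)}`.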