Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "We get what we wish for and now realised it may not be the smartest thing to hav…" (ytc_UgzdllF71…)
- "If you are modeling AI after human intelligence you are going to get all the thi…" (ytc_UgzD1cLO7…)
- "I like AI art for one purpose. Inspiration. If guy has generated this, then use…" (ytc_UgzoLa9Nf…)
- "See this documentary about "Dystopia in Shenzhen China", where cameras everywher…" (rdc_efbw28c)
- "Not to be a party pooper but this trend was stupid. People are giving the AI "ar…" (ytc_UgxceU3Xg…)
- "I won't ever trust a driverless car. Are humans really this lazy? Humans learn a…" (ytc_UgxHGbUjI…)
- "From silicon valley to data theft valley They've come a long way from Fairchild…" (ytc_UgwqMrL34…)
- "Copilot is garbage, using an ongoing GPT4 conversation for complex problems is w…" (ytc_UgwMHlI0W…)
Comment
What I love/hate about this whole thing is that, as a software engineer, I get to implement these dumb things, telling people the whole time that they aren't going to save money, then I get paid to fix the mess it creates afterwards. I'll have a job for the 40 years at this rate. My most proud implementation of AI is using a single consumer GPU in a back office to ingress and process emails into standardized work tickets for our CRM. It's dumb, boring office work that no one wanted to do, it actually saves time, and it'll still work exactly as it does when OpenAI implodes.
youtube · AI Jobs · 2025-12-24T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugzcg2taNPWlFdbKexZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxunQiPLpxFl415ZcJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx_Y1KR2Km62TdGuD14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwqoRT-FG8tEOtBSVh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxv62Z2dm1lYsmryr54AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugz3-Ax2Cmt7Z3e7NM14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyjMTXCm-OQwN9hTtp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgypOqPRlpSRxh0zoAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgykuPTaf9MxP3NlWZR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxvHUCFBtwuKYGgXlV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
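The comment-ID lookup described above can be sketched in a few lines: parse the raw LLM response as JSON and build an index from each comment's `id` to its coded dimensions. This is a minimal illustration, assuming the response is a well-formed JSON array with the field names shown above; the `index_by_comment_id` helper is hypothetical, not the dashboard's actual implementation.

```python
import json

# Abridged sample of the raw LLM response shown above (two records).
raw_response = '''[
  {"id": "ytc_Ugzcg2taNPWlFdbKexZ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxunQiPLpxFl415ZcJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Map each comment id to its coded dimensions (responsibility, reasoning, policy, emotion)."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"} for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugzcg2taNPWlFdbKexZ4AaABAg"]["responsibility"])  # developer
```

Keying on `id` makes the lookup O(1) per comment, which matters when a batch response codes many comments at once.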