Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "@michalw8865 it is never to late to save humanity. We dont need to completely ba…" (ytr_Ugyq5wjE9…)
- "We need AI to verify AI! I think the Gartner hype cycle needs to be understood a…" (ytc_UgwEBds7A…)
- ""ai artist" yeah right more like ai image generated users- they arent artists...…" (ytc_UgzU4Jf5Q…)
- "It all just depends on what prompts you put in and the type of conversations you…" (ytr_Ugyi-7SHO…)
- "So then Canada will send most troops to Europe to defend their allies and Trump …" (rdc_mcrv8cn)
- "If at least most of the cars are self-driving, and have the ability to communica…" (ytc_Ugg6S0mO_…)
- "AI can’t replace human emotion and connection. BTW-I REFUSE TO USE SELF CHECKOU…" (ytc_Ugw1q8-Wm…)
- "I hope the internet doesn't die😢 ... But I DID see a lot of AI generated vidios,…" (ytc_UgyHGIvrH…)
Comment

"The problem is that it's very difficult to understand code written by someone else, even if it was written by AI. Another problem is that managers believe that if an AI-driven development takes one hour, then the total time spent on that task is one hour, which is false. For every hour of AI-driven coding, there are at least two days of testing."

Source: youtube
Topic: AI Jobs
Timestamp: 2026-02-23T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwMHTi_t1HZXXlSN_Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwZQR2twYOdWCuRz_x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxZ3-vZ1m26PB9E9c94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnJwYmbR6eNnv3u8B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugway3AN4wFa4CQBJiB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw6bqNgHox8UbYNprp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxMNB1nZyIEClPE_H94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxAeX46NAI9J6iQDq94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyzuTWdo-VyD1eOc_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIG6_xazn-vhH-2Qt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
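Before codes from a raw response like the one above are stored, each row can be checked against the codebook. A minimal sketch, assuming the allowed values per dimension are exactly those seen in this batch and in the Coding Result table (the real codebook may include more categories, and `validate_batch` is a hypothetical helper, not part of the tool shown):

```python
import json

# Allowed codes per dimension, inferred from the values visible in this
# response and the Coding Result table; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return human-readable validation errors."""
    errors = []
    for i, row in enumerate(json.loads(raw)):
        if "id" not in row:
            errors.append(f"row {i}: missing id")
            continue
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"{row['id']}: bad {dim}={value!r}")
    return errors

# A well-formed row produces no errors; an unknown code is flagged by id.
good = '[{"id":"ytc_X","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
bad = '[{"id":"ytc_Y","responsibility":"alien","reasoning":"unclear","policy":"none","emotion":"fear"}]'
print(validate_batch(good))  # → []
print(validate_batch(bad))   # → ["ytc_Y: bad responsibility='alien'"]
```

Rows that fail validation can then be re-queued for the model rather than silently coded with an out-of-vocabulary value.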