Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
Comment

> AI will not take our job, at least not yet. It is giving the same bad answers as a code that you can find on StackOverflow. It is like working with a worst of a kind junior dev who can't even use his brain. Not helpful at all, rather upsetting. And has complete Amnesia. Once in a hundreds attempt, it gave an approximately nice answer, the tab got closed and then the who discussion became unrecoverable. Trust me, I tried to ask the same questions, it gave me only wrong answers. (We are talking about Azure SDK, which is Microsoft's own product.)

Source: youtube · Topic: AI Jobs · Posted: 2024-01-14T15:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugz-fo82PtqDlyXUdC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzngCRoKQzoqxBBULV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9fPpL9vGODYhQSPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwXQ4waBsF5PBrxGVF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFP3kDDiAkYzXtdJl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgznDKPbLo3r6PGoJB54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw_srfSfyiDCXPfjmx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxZxhI-T_xiBZFubZN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymMMChTbY3VZTi15t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwqZ8iNIokba5HAaTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
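Looking up one comment's coded dimensions in a raw response like the array above can be sketched as follows — a minimal example, assuming the model output is a valid JSON array of records keyed by `id` (the `lookup` helper and the two-record excerpt are illustrative, not part of the tool):

```python
import json

# Excerpt of a raw model response: a JSON array of coded comments.
raw_response = """
[
  {"id": "ytc_Ugz-fo82PtqDlyXUdC54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgznDKPbLo3r6PGoJB54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def lookup(raw: str, comment_id: str):
    """Return the coded record for one comment ID, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw_response, "ytc_UgznDKPbLo3r6PGoJB54AaABAg")
print(record["emotion"])  # fear
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is the natural place to flag an uncodable response.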