Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- Honestly i really doubt it, if we are looking at "AI" as Generative models, they… (ytc_UgwlZsi6h…)
- No. Homework no responsibility no grit to tackle anything without AI. Training w… (ytc_Ugwo1ZNa9…)
- I just wish the kids felt safe enough to come and talk to their parents or an ad… (ytc_UgxOYNhUE…)
- Everything is done to blame outside forces. Like AI or UFO’s. Who gave who permi… (ytc_UgyfJQTkQ…)
- “Umm actually there ai prompt engineers,” 🤓 , like no they actually call them se… (ytc_UgyAK7sFC…)
- In no way am I trying to be "smarter" than this guy, but... We've been hearing t… (ytc_Ugx4W-2hM…)
- This is one of those things where yes ai is overall safer than HUMAN drivers but… (ytc_Ugz70pNbd…)
- @heatherhansen2910 We shouldn't remove any current limits/laws requiring safe AI… (ytr_UgwH6GYZl…)
Comment

> Why is this relevant/ Who the fuck gives a shit if you know how to code? Coding is fucking useless unless you can build shit that works and doesn't break, solving a problem. 90% of software engineers can't do that, just brick layers. So essentialyl repalceable, atleast with AI you just train them enough, and get moving, fi there is an issue they can figure it out, and anyways the smart 10x engineeres sovle most things anyways, and this ain't it, and they will be 10x with just AI and will understand first principles.

youtube · AI Jobs · 2026-02-19T17:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwpQxZjy3OPpW6dEc54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxcPCqhhlHAm_CV4Xd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxT75Ge6L-gel8cTWR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwIj40eHoYLW-gp4jd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx5NCGoCd8fsdhJ_594AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzWYZJi6fTlzGe1mwR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxjG2EK8oteDDq2Xvx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwj6H8YY3CgklNKuUp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzkfnAl-0RYDFFCa254AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwj29R9mkvu-EsZZdd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
```
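The raw response is a JSON array keyed by comment ID, which is what makes the lookup-by-ID view above possible. A minimal sketch of that lookup, assuming only the record shape shown in the dump (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_by_id` helper is illustrative, not the tool's actual code:

```python
import json

# A small excerpt of the raw LLM response above, shape unchanged.
raw = """[
  {"id": "ytc_UgwpQxZjy3OPpW6dEc54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzWYZJi6fTlzGe1mwR4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the raw LLM response and index its records by comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_id(raw)
print(codes["ytc_UgwpQxZjy3OPpW6dEc54AaABAg"]["emotion"])  # outrage
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when a single coding batch returns many records.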