Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Dave's spitting a lot of facts here. Especially the pareto distribution comment. I too spend about 20% of the time writing code and 80% trying to figure out _how_ to write that code so it does what we expect it to do. I've been experimenting with CoPilot for help writing code: results have been mixed at best. Sometimes it comes up with a solution that simply doesn't do what it's supposed to do. This because documentation is often wrong/incomplete and that is what the model's been trained with. Sometimes it produces a valid solution but uses deprecated commands coz it's working off of historical solutions rather than current implementations. What it excels at is not programming as such, but finding information about a subject. I could see AI replacing the usual google search interface entirely. But so far, John Henry and his sledge hammer still prevail. It turns out the human mind is a lot harder to replace than you'd think...
Source: youtube · Topic: AI Jobs · Posted: 2024-01-22T20:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzfjT2cjcILsFCg_Dt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx00X5I6lbBbVVJzIR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw8edV2_awfMfnMUbh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyiXYfYSERsz2Tvk0t4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzlkG2ffP6X0S7hNv54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw4rpnhq3R6Zcaj2sJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwLEPAJA03Rz_hkNHp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxdgdUPWo8D_qHD5jF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxJjYCs3HMDNkEqt1V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzhvxTO9KExVreBLUV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
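Since the raw response is a JSON array with one object per comment, looking up a single comment's coded dimensions reduces to indexing the array by ID. A minimal sketch using only the Python standard library (the two entries are copied verbatim from the response above; the surrounding variable names are illustrative, not part of any tool's API):

```python
import json

# Two entries reproduced from the raw batch response above.
raw = (
    '[{"id":"ytc_UgzfjT2cjcILsFCg_Dt4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgxJjYCs3HMDNkEqt1V4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)

# Index the parsed array by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coded dimensions by its full YouTube comment ID.
print(codes["ytc_UgxJjYCs3HMDNkEqt1V4AaABAg"]["emotion"])  # fear
```

Note that the lookup requires the full comment ID as it appears in the JSON; a truncated ID from a display widget will not match as a dictionary key.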