Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
13:00 you cannot create templates for systems that do not give fully determined output. This is something I have to fight my coworkers on daily: if we need something automated, writing a script that does something reliably and repeatably, even if it requires some upkeep, is a better use of time than writing a set of prompts that are unreliable and perform different operations every time, even if those prompts technically shouldn't need to be maintained since the machine will be reading maintained human instructions.
For something to work as a templated operation, it needs to produce the same output every time the template is used; otherwise it's not a template.
| Platform | Video | Posted |
|---|---|---|
| youtube | AI Jobs | 2026-04-15T14:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyFWoYuukskLROyoMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwiY65FZJbo8Yo_03d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxG5orH9f_r5YL4FtJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxOa1Cw5bSJA8Mk0fZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQgXTiHb28286kr1B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxb3R20IBUrST1PmS54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxzRobUZl9aYWJZvtp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxLD3N-0nFAy6A6r154AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNrNv5YX4ULUbg9qd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzvJjEdEnK51t6Zmlp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
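Inspecting the exact model output for a coded comment amounts to parsing this JSON array and indexing it by comment ID. A minimal sketch in Python — the `index_codes` helper is illustrative, not part of the actual tool, and the two records below are copied from the raw response above:

```python
import json

# Two coding records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgyFWoYuukskLROyoMF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxzRobUZl9aYWJZvtp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the raw JSON array and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_UgxzRobUZl9aYWJZvtp4AaABAg"]["emotion"])  # fear
```

Each dimension in the "Coding Result" table (Responsibility, Reasoning, Policy, Emotion) is then just a field lookup on the matching record.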