Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgwjZtjfi…: "This guy reminds me so strong of the guy in the movie where he makes a girl robo…"
- ytc_UgxprMh3a…: "Look, i dont give a shit what stranger goes through the shit ive said, i use it …"
- ytr_Ugx9TDnF9…: "Not to mention how bad it is physically for all of us to constantly r touching e…"
- ytc_Ugz5H06L8…: "I think that's too generic to say all jobs will be replaced by AI. There are job…"
- ytc_Ugy1ebF22…: "If someone fails to use a crosswalk, that does not give you the right to hit the…"
- ytc_Ugy4EFdx7…: "Isn’t ai just everyone’s knowledge but together and accessible. By that logic , …"
- ytc_UgzCijwuw…: "What no one seems to be addressing are the safety issues. Driverless trucks are…"
- ytc_Ugzqvro3C…: "You mean AI WILL. Better find a solution because it’s too big to stop now.…"
Comment

> AI will destroy humanity. As a retired programmer, I feel certain of this. As a programmer, I used to tell computers what to do. When they start telling us what to do, humanity will lose its freedom of choice. When humanity loses its freedom of choice, humanity ends. Film confirmations of this hypothesis: "THX: 1138"; "2001: A Space Odyssey"; "Colossus: The Forbin Project"; "The Matrix".

youtube · AI Jobs · 2026-02-23T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxO_UJTTjQFsX5x6Fl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyD7LLLdaBcy9UHPml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwkXkeuWWNDZtGL0bt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzPOvxFKxNoW0_8pqV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZZfxrNpXXWlW7IER4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyNVeKGdai3Upv77Bd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxaV_fJXd7LeGq7F9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzk6sK6VpbT_d6RIrt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwLOE9XtVI4wM32GZx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4_7_ds_KKVLGBqDR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
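The lookup-by-comment-ID flow above can be sketched in a few lines of Python: parse the raw JSON array the model returns and index each record by its `id`, keeping only records that carry all four coding dimensions shown in the result table. This is a minimal sketch, not the tool's actual implementation; the function name `index_by_id` and the validation rule are assumptions, and the sample record is taken from the response above.

```python
import json

# One record copied from the raw response above (truncated to a single entry
# for the sketch); a real response is a JSON array of many such objects.
raw_response = """
[
  {"id": "ytc_Ugzk6sK6VpbT_d6RIrt4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "ban",
   "emotion": "fear"}
]
"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model output and build a comment-ID -> coding lookup.

    Records missing any of the four dimensions are skipped, so a partially
    malformed response does not poison the index (an assumed policy).
    """
    records = json.loads(raw)
    return {
        r["id"]: {dim: r[dim] for dim in DIMENSIONS}
        for r in records
        if all(dim in r for dim in DIMENSIONS)
    }

codings = index_by_id(raw_response)
print(codings["ytc_Ugzk6sK6VpbT_d6RIrt4AaABAg"]["emotion"])  # fear
```

Indexing once up front makes every subsequent "inspect this comment's coding" lookup a constant-time dictionary access rather than a scan of the raw array.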