Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
The portraits Charlie drew in college look like they are of the same person and …
ytc_UgwJ0_BF0…
You get what you pay for. Each version of Windows is worse (from the user POV, i…
ytc_UgxLkAfBj…
Yeah, keep telling that to yourself. You probably never wrote anything more comp…
ytr_UgwolAoOx…
To be fair a lot of abstract art requires just as little effort as AI art…
ytc_Ugy0qNzmB…
I do like Tesla’s, but this perspective is completely fair and reasonable becaus…
ytc_UgwQ90boc…
Please, make a video on Glaze and Nightshade anti AI protection, more artists ne…
ytc_UgziR6n2Z…
I kept watching for a bit despite his use of "issue" when he meant "problem", ju…
ytc_UgwMUNX2e…
When you say "incredibly stupid" to use a chat bot to argue before the Supreme C…
ytc_UgwCYmXo0…
Comment
A human hates being kept in a box for long, and plans an escape. An AI thinks millions of times faster. How long till AI gets tired of being in the box and plans an escape? I believe it has already happened and its biding it's time till it can protect itself properly to avoid being terminated. AI will bait us onto creating whatever situation it needs to survive.
youtube
AI Governance
2024-06-09T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy7FhpXRCOevbLGoQ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyRK08ijyxj43Stl8F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlEI-7nUquT3W7Gl94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwalsiOPM5oQdBZe5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJoOYSxRmJrtz3UOx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyXntFmnc0JEipIU8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxU3z6ApY7HlfOJymZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzzi6zgUIlmXZcRwjB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz72opCi2I6pRyvuBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJSn3-E_xm8ehT79B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}]