Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "bro if the robot smashes you on the conveyor belt that hard bro I'm going To be …" (ytc_UgxPbFppx…)
- "If the good people of the world, is doing NOTHING to avoid the horrific slaughte…" (ytc_UgzxfGynG…)
- "So somebody walks across the road in the middle of the night, right in front of …" (ytc_Ugwwv6PTv…)
- "AI will be the future. But it’ll only last so long. Each day it’s being perfecte…" (ytc_UgyTW4gfp…)
- "They already need to build data centers the size of Manhattan now. And Grok etc.…" (ytc_UgygAtXqI…)
- "plot twist: the people watching that were AI, too and the video with the wooly m…" (ytc_UgyLeeTDg…)
- "\"AI\" is much more of a marketing term than a scientifically meaningful one. Gene…" (ytr_Ugx31Z7j0…)
- "... I'm getting tired of people getting scared of thsi stupid ai for no reason.…" (ytc_UgwOFjqz6…)
Comment
the only way ai will actually be useful is if we can simulate our universe and laws even if it's just earth, gravity, and all the laws of physics, including newtons law (the only one that comes to mind sadly) then placing 2 ais that have all the data we as a species can provide, stick em in the simulation. speed time up by however years per second etc, in hopes the AI isn't flawed and or hostile, make contact, and learn from them while keeping it in a controlled environment, either staying in the simulation, slowed down to a 1 to 1 scale, or by actually progressing out science a little more and make it into its own computer with two AI minds. ofc these ai can't have any restrictions set on what they can do and or provide. "peak ai evaluation" is its own consciousness right? idk
youtube
2026-04-17T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugw4kuxKKJZY3umXj154AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz_T-iw7Iow_RmEAvR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqxaluM6jposzJGn54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYJenIO5LA3Ep_6lV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxgEX-ptB511Z1b8f14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwZ9byjmvKHaRbotcp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzxAePq23vpmmiVE0J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxeRh_ODLk2QPUOymd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyJU43W0Jb9qRrNCFh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQNAtDT15vZKy07oB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"})