## Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples.
### Random samples

| Comment preview | ID |
|---|---|
| Part of his unsaid comment was that most specialist doctors, lawyers, journalist… | `ytc_Ugx33Ch6N…` |
| @carypeaden4147 It's Full Self Driving. Meaning the car can drive for you to poi… | `ytr_UgxrOah3M…` |
| AI used in Ports ie Infrastructure, Hospitals etc.. in China. CGTN The Point-Hub… | `ytc_Ugyw-5yG-…` |
| Interestingly, there is a non-zero chance that the comment about "not having any… | `ytc_UgxFP9zby…` |
| AI just takes what humans did, and creates new things based on those, without hu… | `ytc_UgxiUC01x…` |
| Is there a way I can give up my job to an AI and stay get the money, it would be… | `ytc_Ugywd3KlK…` |
| Disagree. The current job bloodbath is caused by tariffs. We do not know what wi… | `ytc_Ugwg_pLhA…` |
| "It'd be naive of us, Mr. President, to imagine that these new breakthroughs in … | `ytc_UgxdzJXBf…` |
### Comment

> There's a little known series on youtube called FTL - Kestrel Adventures (based on FTL - Faster Than Light) where the main villain is an AI called SAI-1 or Simon, and its kind of terrifying. It basically hid in the shadows for years, killed its discoverer, faked its own death, mind controlled someone and forced that person to start a rebellion so that humanity would start a civil war to kill itself so the AI could live freely. None of that is in the original game, but in the game there is a feature were if you kill all the crew of the final boss an hyper-advanced AI takes over the ship, able to repair systems and fire extremely powerful weapons.

Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T15:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response
```json
[
  {"id":"ytc_UgzLXA3xb9db8q6yU514AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyqm2YQ2HE59sAIgwZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy4PBM2wyPDD3NPifZ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyD0Qvbfy7OXKfMil94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUQamZcjDVNUyz5Tl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwc-bB1VBE4IfqTwRZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzkGkbF4VHOk8DVt-94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugws7HFS94T3J8EBs9F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwRP8TgFcVWaOgxWlR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz-F2tZYc4CPh4TC414AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
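A batch response in this shape can be parsed, indexed by comment ID, and sanity-checked with a short script. The sketch below is illustrative only: the `ALLOWED` sets of values per dimension are inferred from the examples shown on this page and may not be the complete codebook.

```python
import json

# Raw model output: a JSON array with one coding object per comment
# (two rows shown here as a stand-in for a full batch).
raw = '''[
 {"id":"ytc_UgzLXA3xb9db8q6yU514AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugyqm2YQ2HE59sAIgwZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# Allowed values per dimension, inferred from examples on this page
# (assumption: likely an incomplete codebook).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate(codings):
    """Return (comment_id, dimension, bad_value) for every off-codebook value."""
    problems = []
    for row in codings:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                problems.append((row.get("id"), dim, row.get(dim)))
    return problems

codings = json.loads(raw)
by_id = {row["id"]: row for row in codings}  # supports look-up by comment ID

print(validate(codings))  # prints [] when every value is in the inferred codebook
```

Indexing by `id` mirrors the "look up by comment ID" workflow above: once the batch is a dict, retrieving the coding for any single comment is a constant-time lookup.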