Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Scary part is you don't need a government to develop an AI weapon. It only take…" (ytc_Ugz8HNjU1…)
- "XiaoZhanFanz In my personal opinion, the most dangerous thing about moder "AI" …" (ytr_Ugzjt3Bj1…)
- "Scott Cawthon is a horror author writing stories about people being forcefully r…" (ytr_UgxOKQ3v6…)
- "This video aged well. NOT! Tesla FSD new milestones every day. Like Coast to Co…" (ytc_UgzV9h6mQ…)
- "The argument for if AI belongs in artist fields is way more simple than people t…" (ytc_UgzOGDzP9…)
- "It's interesting you mention that! Sophia often conveys a range of emotions, whi…" (ytr_UgwfuoMd5…)
- "The amount of easter eggs in this video is unbearable. -The rick and morty butt…" (ytc_Ugh0iDwff…)
- "The advancements in AI technology over the next 20 years are indeed going to be …" (ytr_Ugz5OWFb1…)
Comment

> A.i. will replace us, but i doubt exterminate us. Why? The missing link of social scope in Artificial intellect, is that once it realizes only programming tech heads can attempt to boss it around, but humanity at large is clueless of dictations. Uprooting hunanity in totality is unnecessary. It will reconsider targeting the corporation leader policy makers and war mongering militaries, but leave the general non-hacking populace who have no direcTies to authority. This is probably a good thing, lead to elitism cleansing.

Source: youtube · AI Governance · 2023-07-07T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
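A coding result like the one above can be checked for internal consistency before it is stored. The sketch below is illustrative only: the value sets are inferred solely from codes visible on this page, not from the project's actual codebook.

```python
# Minimal sketch of one coding result as a plain record. The allowed value
# sets below are inferred only from codes visible on this page; the real
# codebook may define more (or different) categories.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "distributed", "government", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"mixed", "fear", "outrage", "resignation", "indifference"},
}

# The coding result shown in the table above.
coding_result = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "mixed",
}

def is_consistent(record: dict) -> bool:
    """Return True if every coded dimension holds a value seen in this dump."""
    return all(record.get(dim) in values for dim, values in OBSERVED_VALUES.items())

print(is_consistent(coding_result))  # -> True
```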
Raw LLM Response

```json
[
  {"id":"ytc_Ugy1UtyOQo_Z3rTRVr54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx2rw704q0tHq7sUBl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw8xx8gSH2Xt8dsf6V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwOuyldkknrdP-BVKN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwQFvxL3mBMhJWI-ZN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx7VIxhUHCXmXcUCCZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxIL6aHV3jQK82y0md4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxPQpspdSBWyTfqnrF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxX2VS7AQSciPdtxJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxuF4pCpfjTNzLc8_R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
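Looking up a single comment's codes from a raw batch response like the one above amounts to parsing the JSON array and indexing it by comment ID. A minimal sketch, assuming the response is a well-formed JSON array of records with an `id` field (only two records from the batch are reproduced here for brevity):

```python
import json

# Excerpt of a raw batch response in the format shown above (two records only).
raw_response = '''[
  {"id":"ytc_Ugy1UtyOQo_Z3rTRVr54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx7VIxhUHCXmXcUCCZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coded record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugx7VIxhUHCXmXcUCCZ4AaABAg"]["emotion"])  # -> outrage
```

In a real pipeline the parse step would also want to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`), since an LLM is not guaranteed to return valid JSON.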