Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect
- `ytc_UgwxOqyP1…`: When AI figures out the profitability of destruction (to it, the intrinsic "supe…
- `ytc_UgxLODvYA…`: “AI could never replace ANY human jobs” “AI could never replace customer-facing …
- `ytr_UgwjGpdwI…`: @Joephnarro In Machine learning, we have 2 methods of prediction, regression (pr…
- `ytc_Ugwk9MWe0…`: Well just penalize company's: If you replace a human workforce with AI (example …
- `ytr_Ugwf7HBsV…`: AI stands for Artificial intelligence which at base definition means intelligenc…
- `ytc_UgzNnD5GI…`: What I call "algorithmic culture" seemed like a big problem a long time before A…
- `ytr_UgxUWcrf7…`: No one says AI is specifically evil. But if you wanted to build a dam for renewa…
- `ytc_UgwLog70v…`: 6:54 evolution dictates that humans as we know today will not exist at some poin…
Comment (youtube · AI Governance · 2023-07-17T14:5…)

> The fact that we can communicate across the globe in any language but our first thought is to use AI for military just shows how flawed we are. Why does war even exist still? Couldn't we just play a game of Call of Duty instead of actually murdering hundreds of people?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxWkqO2pXoYWhhfEqF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz4ah8wrzJEAUaJfZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx--zg7mIdhAlnCY-x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5ytJnBefBk8ihkrF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtP2atgafI1p-52614AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyv6Q3RLs52VVBdsDx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvkiIV5Ejdv0nLSxF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIN-itGtKx9v-tR6N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxnYLy_bezNtNvAx0h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxhXNH3EsmH3fXlcc94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
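The lookup-by-ID view above implies the raw response is parsed and keyed by comment ID. A minimal sketch of that step, assuming the JSON array shape shown (the function name `index_by_id` and the validation logic are illustrative, not the tool's actual code; the sample payload is truncated to two entries):

```python
import json

# Raw model output in the shape shown above (truncated to two entries).
raw = """
[
  {"id": "ytc_UgxWkqO2pXoYWhhfEqF4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz4ah8wrzJEAUaJfZB4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

# The five fields every coded entry carries in the response above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment ID.

    Raises ValueError if the payload is not a JSON array of objects
    carrying all expected coding dimensions.
    """
    entries = json.loads(raw_response)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of coded comments")
    coded = {}
    for entry in entries:
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing fields: {missing}")
        coded[entry["id"]] = entry
    return coded

coded = index_by_id(raw)
print(coded["ytc_UgxWkqO2pXoYWhhfEqF4AaABAg"]["emotion"])  # → outrage
```

Validating the required fields up front surfaces malformed model output at parse time rather than during a later lookup.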