Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Death to clankers (and this is only half ironical) I asked the google bot, and …" (ytc_UgwJqB_pm…)
- "Given a long enough time horizon, it's really hard not to agree with Roman. To s…" (ytc_UgxxVYoBb…)
- "Perhaps AI through some algorithmic mishap (or not) may read many of the oligarc…" (ytc_Ugx-cE7Ws…)
- "Tesla can ban features like self driving when used improperly. Not like that's a…" (ytc_UgyURo9h2…)
- "All these videos identify the issues and concern while offering little to no pla…" (ytc_UgxtsRu-L…)
- "self driving has hundreds of years of driving experience but still only has the …" (ytc_UgznjBamx…)
- "So wait, we have ai powered robot waifus and ai has emotion. To quote the best …" (ytc_UgwZzTePB…)
- "Nightshade got cracked last year. If you want to poison AI, don't use an AI to a…" (ytc_Ugz2ciZXt…)
Comment

> Mr. Tyson. I think we have a similar hope for the future of AI. One where it is used to remove the tedious tasks that no one wants to do. The issue is, that’s not what companies want. They want to remove us from EVERY position. They want to cut costs everywhere, including in your creative endeavors. We are the expenses they are trying to remove.

Platform: youtube
Topic: AI Moral Status
Posted: 2025-08-25T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwTYHuiGxCm4vPkkKR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwSmyEJEi5TMB1gvBB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxK-aAPLKPGmqJeiMt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw87Rpg5O-Ego9nKXN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzyYBp-8Gcx5S35jCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwg5bEuXaPduEiPG4Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwqfy-ReMP7hK7OJiF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyweyaWFBDOJY2zb-Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxAw_dU9oqjoay4Uyt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx8GHnFyzQkvG38ejR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
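A batch response like the one above can be loaded and queried with a few lines of Python. The sketch below is a minimal illustration, assuming each model response is a well-formed JSON array whose objects carry the field names shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two-record `raw` string is just a representative sample, and real code would need error handling for malformed model output.

```python
import json
from collections import Counter

# A representative two-record slice of a batch response (assumed well-formed JSON).
raw = '''
[
  {"id":"ytc_Ugwg5bEuXaPduEiPG4Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyweyaWFBDOJY2zb-Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

codes = json.loads(raw)

# Index the records by comment ID, mirroring the lookup-by-ID view above.
by_id = {row["id"]: row for row in codes}

# Tally one coding dimension across the batch.
emotions = Counter(row["emotion"] for row in codes)
```

With the full batch loaded the same way, `by_id` gives constant-time access to any coded comment, and a `Counter` over any dimension yields the per-label distribution.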