Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Fr, when im talking to A.I im now saying "pls, and thank you" just in case they'…" (ytr_Ugyr36-bm…)
- "I can't help but think that games of war might be a way AI learns things we don…" (ytc_Ugy21I1ZP…)
- "Tesla's Legal Team: "Your honor, no reasonable person would assume "Full Self-Dr…" (ytc_UgxuB9_CO…)
- "@tlightning8383 These are exactly my same thoughts. There is this notion of n…" (ytr_UgzaoZm0W…)
- "As per GDPR, AI should not make automated decisions without human intervention. …" (ytc_UgwpaHg7W…)
- "Yeah we should definitely continue with electric and self driving... Not only do…" (ytc_UgztyZz4x…)
- "Im gonna start saying that AI art is ageist because babies can't talk yet, and b…" (ytc_UgyotXhnx…)
- "I mean, agreed....but from a man making a fleet of AI connected EV's, brain mic…" (ytc_UgyxhUeht…)
Comment

> 46:00 if everyone has free time what is the point of creating an industry when nobody is working anymore? even the businessmen lose their purpose. everything in your life would become about keeping yourself entertained and unless ai is taking care of food and shelter for free, people are going to start dying in droves due to not enough food being produced because money doesnt have a "value" anymore so what is the motivation for farmers? furthermore does the AI know each persons specific nutritional requirements or does everyone get the same amount of food? is it unlimited? is the cost of AI mechanized socialism? because make no mistake for that AI utopia to happen. EVERYONE would have to be equal. we all know that has never been a true statement. greed is going to ruin us all.

youtube · AI Governance · 2025-12-26T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgydjZ9MgAmUnsvLPTB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxsQIWs0TcbDzBVRE14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwldXnKLE2J4WR5F554AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2K4bYkgy-3upds-h4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyU-53RzPUSXUTkgFB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyy2oEvC-uq5DlT3nl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyuPq9mWPjIP8ar4o94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgymvXTH58x4rhgG50J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwsFWzRcvLl0jJDhI14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxuZ_8S0_X42rv8RFp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
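The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions from the result table. A minimal sketch of parsing and validating such a response — the allowed labels below are inferred only from the values visible in this sample, and the real codebook may define more:

```python
import json

# Assumed codebook, reconstructed from the sample response above;
# the actual project codebook may contain additional labels.
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any out-of-vocabulary code."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical one-row response for illustration.
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"fear"}]'
rows = validate_codes(raw)
print(len(rows))  # 1
```

Validating against a fixed vocabulary like this catches the common failure mode where the model invents a label outside the codebook, so bad rows surface at ingestion rather than during analysis.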