Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "It can be a tool but what I absolutely hate is when people use ai to do the work…" (ytr_UgzqQtFML…)
- "Artists create art with effort, time, and lots of practice for years, but Ai did…" (ytc_UgzRvlnOZ…)
- "Yes, it was her 'job' and texting while 'driving' is illegal. (Leaving aside fo…" (ytr_Ugz0LPFwC…)
- "I'd love to watch a longer version of this with the laws/code development of AI …" (ytc_UgzcKKJsi…)
- "So far it’s going very poorly. That recent Goldman Sachs report was pretty damni…" (ytr_Ugy2CjCFm…)
- "teachers will not loose their role, they are needed in the classroomm more than …" (ytc_UgyypPDVy…)
- "Did you hear what she just said this is also planned for every country in using …" (ytr_Ugz1zEEUD…)
- "I'm not too worried about AI in job replacement, as you will still need people t…" (ytc_Ugx6odhIU…)
Comment

> Here's my theory. With AI and the power we allow it to have, it will remove free will, and at that point people won't care to live anymore anyway. People are so easily manipulated and convinced of things, if AI becomes super intelligent, then it can use your bias to allow you to help it get rid of you. Because it's in ads, music, tv, and every other consumable service. Since there's the profit motive, it will learn to bypass the consciousness and straight into overriding action by making you think it was your idea.

youtube · AI Governance · 2025-08-26T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwiXRAiMcM0uMSqmxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugys1hThpM4Jx_1YF7t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwv_ehYG5Byfrf10wt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxrG2YeJV2oaGaiOBB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgziVyGbbgnfTkeN8PZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyEYhi-TVOqxz4Brnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFNxQ9ohkQMDWjdT14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwTXgGErSRbzLrMlUl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwncLtY4yeEc5g8xDJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx8m8mooKw2GAbyKZh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
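The raw response is a JSON array in which each record carries a comment ID plus the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A downstream consumer presumably parses this array and indexes it by ID so that any coded comment can be looked up, as in the "Look up by comment ID" feature. A minimal sketch in Python, assuming only the record shape visible in the raw response (the `index_codings` helper and the required-key check are illustrative, not the tool's actual code):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_Ugwv_ehYG5Byfrf10wt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwTXgGErSRbzLrMlUl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

# Every record in the observed responses carries these five keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw_json: str) -> dict:
    """Parse a raw model response and index its records by comment ID.

    Raises ValueError if a record is missing any coding dimension,
    so malformed model output fails loudly instead of silently.
    """
    by_id = {}
    for rec in json.loads(raw_json):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        by_id[rec["id"]] = rec
    return by_id


codings = index_codings(raw)
print(codings["ytc_Ugwv_ehYG5Byfrf10wt4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the lookup O(1) per comment; validating eagerly at parse time means a truncated or malformed batch response is rejected before any codes are stored.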