Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I think there is arguments to be made that if an AI can suitibly replace an acti… (ytc_Ugy1Znvtk…)
- When they do "learn" 5 different things in 2 hours and all of it is in a.i. to "… (ytc_UgwxaI3Rj…)
- Analogy. Malaria kills millions of humans, so humans developed a method of DNA … (ytc_Ugwb5ugSK…)
- all these people that are responsible for creating AI dismissing and avoiding th… (ytc_Ugz1KHb1b…)
- The threat isnt LLM it's all the idiot c suites that think jobs are just people … (ytc_Ugy4Qsdop…)
- Even if the disability argument was solid (which it's not, people will always fi… (ytc_Ugwkg3SI5…)
- One of the scariest parts is AI could fulfill multiple forms of the "end of the … (ytc_Ugzi3Quli…)
- I really don't like this. I think AI is a very slippery slope, we should proceed… (ytc_UgwklwAae…)
Comment

> What's so bad about all this? The tech companies are draining each of the states of their electricity. Second, they're passing the cost onto the public. Third, it does not work AI that is 4th. They can't find a use for it. At least the public to use so they can generate money, but they are still draining the United States of money to pay for their tech failures of the Hi-Tech corporations

youtube · AI Responsibility · 2026-02-19T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxO-l2cv8SdBwVFs7B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw2wGdkUgVMStQ92714AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwjfbPZvQ-qZ2daGO14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwXzIsvYd_TQiePhwB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyE_0h4RhSlVlDLeg94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxVsfmlVD_ZFHGqral4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxP9Te7AIzikjhQfGV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyws6KLpJoeijLJj5J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyALsN9atZzB6sNXOV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzFLzCp97xutgjrhex4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
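A raw response like the one above can be turned into per-comment codes with a small parsing step. The sketch below is a minimal, hypothetical example (not the tool's actual implementation): it assumes the response is a JSON array of objects keyed by `id`, and it treats the label values visible in this page (e.g. `company`, `consequentialist`, `liability`, `outrage`) as the allowed vocabularies. The real codebook may contain additional categories.

```python
import json

# Label sets observed in the coded output above; assumed here to be the
# full vocabularies, which the real codebook may extend.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Rows with a missing id, a missing dimension, or an
    out-of-vocabulary label are dropped rather than guessed at.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Dropping invalid rows (instead of storing partial codes) keeps downstream tallies clean: any comment absent from the result can simply be re-queued for coding.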