Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_Ugy59ASvx…`: "The guy was a studying computer science so he knows how to prompt it and teach i…"
- `ytc_Ugzy7Xfiz…`: "welcome to the internet sucks that its this way but crying like a baby on camera…"
- `ytc_UgzdmV-wT…`: "Nonsense that 1. Any autonomous drone is cheap compared to a modded DJI, and 2. …"
- `ytc_UgyFkktmg…`: "This is the first time ive seen lavender but im in full support of this poisonin…"
- `ytc_UgyEl2Od0…`: "AI is part of automation, automation is not done. Best outcome would be that we …"
- `ytc_UgxLKsgGW…`: "Its a human issue not an AI one. Humans are the ones choosing to trust in and fo…"
- `ytc_UgxjnSt8V…`: "They want robots and ai to replace all our jobs so they can become rich while th…"
- `ytc_Ugz2L_8kJ…`: "Qualia is woo woo.. the redness of red is based on our prior experiences no diff…"
Comment

> Think:
> If companies manage to get AI that replaces humans in each of the existing work spheres... how are people going to earn money to sustain the capitalist system?
> In a certain period of time, the domination of AI in knowledge and work will force humanity to enter a new system. It is up to us whether we begin to prefigure it as a humanistic system where everyone would fit, or a dystopian authoritarianism.

youtube · AI Governance · 2024-01-03T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzbn6a_dwlmjolXoL94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZbTEQqncUU7eMDiZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3nFHU5vnBwMHx7Oh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwDmteD6MITnX0p-Vp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyq94my2nvvbl6S89V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgytGcExoFcvXVFwLVl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4txTp5ZnpGYfAost4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZ7y2YwaLVeycwMBV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4gMFXd6ZPfLWnLjh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzqV8LylRk_ZCLlB-R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
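A batch response like the one above can be parsed, validated, and queried by comment ID with a short helper. The sketch below is a minimal illustration, not the tool's actual code: the `ALLOWED` value sets are inferred from the sample output and the real codebook may differ, and `parse_batch`/`lookup` are hypothetical names. Prefix matching on IDs mirrors the truncated IDs shown in the UI.

```python
import json

# Allowed codes per dimension, inferred from the sample batch above
# (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the rows by comment ID.

    Raises ValueError on rows that lack an id or use an out-of-vocabulary code.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = row
    return coded

def lookup(coded: dict, id_prefix: str) -> dict:
    """Look up a coded comment by full ID or a truncated prefix."""
    matches = [v for k, v in coded.items() if k.startswith(id_prefix)]
    if len(matches) != 1:
        raise KeyError(f"{id_prefix!r} matched {len(matches)} rows")
    return matches[0]
```

For example, `lookup(coded, "ytc_Ugx4txTp5")` would resolve the truncated ID to the single matching row; a prefix matching zero or several rows raises instead of guessing.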