Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Pretty easy question tbh. This is overly reductive but: Is consciousness a ta…" (rdc_ichinul)
- "I'm pretty sure, we will creafe an AI that attempts to destroy us. I'm also pret…" (ytc_UgwiWSjUv…)
- "Science has absolutely NO clue what the observer is. I could do everything I do …" (ytc_UgwXAM-B6…)
- "The donation vs. spending debate is real. I use Pneumatic Workflow to automate o…" (ytc_UgxGem2Nm…)
- "The "scum" aren't the ones being fired. You think the CEOs and decision makers a…" (rdc_czl4r3p)
- "The ai is indeed just a tool, but it is being misused. We can't exactly target t…" (ytr_UgwJM0Tkp…)
- "The sometimes nightmarish uncanny valley-esqueness of art done by artificial int…" (ytc_Ugx0fSWgK…)
- "If anyone has read On Bullshit… generative AI is a Bullshit Engine. It isn't ly…" (ytc_UgyWaZcwZ…)
Comment
My nightmare is having a handfull of oligarchs owning the AI, keeping all the benefits, and the 99% starving to extinction. I don't worry about having too much free time, people find what to do and, as long as we still have social connections, it can be a very good thing. For me, the problem is to make sure that the AI is doing the work and the humans are doing the fun. If the AI is narcisistic like and demands our full attention we're doomed! So, we need to train our children to social interact! Not to work (as we do today). And, the social contract has to change dramatically! Either everything is for free and money is not needed or the government and/or campanies have to distribute money in exchange for ... nothing. After all, the AI demands investment but not salary and the productivity is over the top.
youtube | AI Governance | 2025-09-05T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
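The four dimensions in the table take values from a small vocabulary. A minimal validation sketch follows; the function name and the allowed-value sets are assumptions inferred from the sample responses on this page, not an authoritative schema.

```python
# Validate one coded record against the dimension vocabularies seen in the
# sample output on this page. ALLOWED is an inferred, hypothetical vocabulary.
ALLOWED = {
    "responsibility": {"none", "government", "distributed", "ai_itself", "company", "developer"},
    "reasoning": {"mixed", "deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "regulate", "unclear", "liability"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def validate_record(record):
    """Return a list of validation errors for one coded record (empty if valid)."""
    errors = []
    for dimension, allowed in ALLOWED.items():
        value = record.get(dimension)
        if value not in allowed:
            errors.append(f"{dimension}: unexpected value {value!r}")
    return errors

record = {"responsibility": "company", "reasoning": "contractualist",
          "policy": "liability", "emotion": "fear"}
print(validate_record(record))  # -> []
```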
Raw LLM Response
```json
[
{"id":"ytc_UgyNF0aKgDtCUa1EQrh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4k3Wq4TaazAEiRjp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgycNJHQCL07_tzmxBV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZYbea1OiLTu23y0t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzdWWIT6x_31IhMsVN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWrPw23LWYkMaepQV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxO5XdBZNkSiMmJh3B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNNETeOZ_v97rHpa14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzV3yzjkFTOPzAUw254AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1nSzqyqqX386ZS4F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
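Looking up a coded comment by ID amounts to parsing the model's JSON array and scanning for the matching record. A minimal sketch, assuming the response is valid JSON as shown above; the function name `lookup_by_comment_id` and the two-record sample are hypothetical.

```python
import json

# Hypothetical raw response text, abridged from the "Raw LLM Response" above.
raw_response = """
[
 {"id":"ytc_UgzWrPw23LWYkMaepQV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugx4k3Wq4TaazAEiRjp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

def lookup_by_comment_id(raw, comment_id):
    """Parse the model's JSON array and return the record coded for comment_id, or None."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coded = lookup_by_comment_id(raw_response, "ytc_UgzWrPw23LWYkMaepQV4AaABAg")
print(coded["policy"])  # -> liability
```

This matches the table above, where the same comment ID is coded as responsibility "company", policy "liability".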