Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment ID | Excerpt |
|---|---|
| ytc_UgzZyK2cH… | (translated from French) "It's not for 6 months that it should be suspended, but forever. la fi…" |
| ytc_UgwlLbulX… | "Now what would be really funny was if Midjourney was able to copyright his artwo…" |
| ytr_Ugjrf2Y85… | "AI is fine, just don't add feelings. Feelings aren't going to magically come out…" |
| ytc_UgwY-3l1y… | "This video seems to be 'advertising' the opportunities to bad actors. I dont tr…" |
| ytc_UgwWH4iet… | "Linus Torvalds, the creator of the Linux kernel, has famously described the curr…" |
| ytc_UgxRzKwvH… | "It’s not about what’s the best it can do, it’s about what’s the worst it can do.…" |
| ytc_UgwtKeRLb… | "Well, thats settled. Sora's gonna have a longer time leveling up since he picked…" |
| ytc_UgziHrRs_… | "These AI data centers are using public drinking water, so I went to a developing…" |
Comment
Keep in mind, there is no skill/ability, in principle, that a human can do that an AGI cannot. AI may cure all diseases, aging, ect but there is no guarantee whatsoever that the elites will share or afford those amazing developments with the then economically superfluous everyday human. As things currently stand regulatorily and legislatively, we're headed to mass unemployablement and a late stage capatlalism dystopia. Many elite and elite adjacent voices are already emboldened enough to express their disdain for democracy and political imput from the general public/working class.
Source: youtube, posted 2024-03-31T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzEGMFXZhLNaD1-4BN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxAtsKNaD2s6fwaSPF4AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_SK84NbH3S8eA6GN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwux3WNOMZ69bhSrFx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxlNvjGB6xA0h0rE754AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZX-ZrM6bVhHqvbCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx3ss3DWn4cPO5RSHN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwRXW3wm65cpzlFTZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0FXQAgzsxgq8Yz2B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykeaE21rwGWZSqIuh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```
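A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the responses visible on this page (the real codebook may contain categories not observed here), and `validate_batch` is an illustrative helper, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real codebook may include categories not observed here.
CODEBOOK = {
    "responsibility": {"none", "elites", "ai_itself", "user", "government", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "unclear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and return {comment_id: codes}.

    Raises ValueError if a row lacks an id or uses a value outside the codebook,
    so malformed model output fails loudly instead of polluting the dataset.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing 'id': {row}")
        for dim, allowed in CODEBOOK.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: {dim}={value!r} not in codebook")
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# Usage with a one-row batch (hypothetical comment id):
raw = '[{"id":"ytc_example","responsibility":"elites","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
codes = validate_batch(raw)
print(codes["ytc_example"]["policy"])  # regulate
```

Rejecting out-of-codebook values at parse time matters because LLM coders occasionally emit near-miss labels (e.g. a synonym of an allowed category), which would otherwise silently fragment the category counts downstream.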