Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytr_UgxoFzLC9…: "We are going to need a new economic system in the future almost no matter what, …"
- ytc_UgzLbKF-p…: "I wonder how Dr. Yampoliskiy views the one big issue with AI and that is it's ma…"
- ytc_UgwX-EOHc…: "“We’re hurtling through society at warp speed — toward a Ferengi-like world driv…"
- ytc_UgzVGIYl9…: "How do you learn to learn when the LLM does all the work for you? How can you be…"
- rdc_ogpdzaa: "The problem isn’t about what jobs should be replaced, its our economic system. …"
- ytc_Ugz_4Ho7D…: "ai is meant to do the shit humans don't want to do, I still want to talk to my l…"
- ytc_Ugwdz__7P…: "This just made me want to do my own ai movie. What did they use to make this? 😊❤…"
- ytc_UgxtEkW2i…: "you assert that the unhinged murderbot aspect is just a native component to GPT …"
Comment
Pretty sure Elon Musk is not a “good” guy.. he wants AI developers to pause for exactly the reason Gates says the “good” guys shouldn’t. Pretty sure Musk would assume control of AGI if he could.. right after signing that letter, he spent 100s of millions on AI development equipment — look it up
youtube
2023-05-22T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxS9bTmjKyrHX0cvih4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNLX_QShQkk2aXIbB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwwBlrDDLIGD9RdZe54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzcdJ74zjPsod_pr0x4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzgi3kiKMfO3aQaukV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyN6iROIv1k2ShXbvx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzfuhHF5go9siA5RZF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZ81B88mO9uIwBkm54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwCvQ3pzYxpEcLUmRR4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-9rhSZbK0uM6X5714AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
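The coding result shown for the comment is simply the entry in this array whose `id` matches the comment's ID. A minimal Python sketch of that lookup, assuming the raw model output is a JSON array of coding objects as above (the helper and variable names are illustrative, not taken from the actual pipeline):

```python
import json

# Two entries reused verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgzNLX_QShQkk2aXIbB4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxS9bTmjKyrHX0cvih4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the raw model output and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgzNLX_QShQkk2aXIbB4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

The same indexed dictionary serves both views of the tool: the per-comment coding table and the raw-response inspection.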