Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- "And even if they did everything right and got all that data ethically they would…" (`ytc_Ugy6R6eO-…`)
- "If AI was this good in 2019 we wouldn't have had to suffer through GOT S8.…" (`ytc_UgyTTN5yM…`)
- "Dude, your mouth is out of sync with the words you say. Are you an AI program? …" (`ytc_UgwG9T2kz…`)
- "who cares if he deepfaked people... like if it's just for your own pleasure and …" (`ytc_UgwoHyfRt…`)
- "This sounded plausible until you said AI told you to look in the bible. Sure Jan…" (`ytc_Ugzhvm80m…`)
- "this argument is dumb cuz we've normalized digital art for a really long while n…" (`ytr_UgwJcNMvv…`)
- "The No AIs will keep multiplying as they realize ChatGPT can expand into their j…" (`rdc_jhclve4`)
- "Ai needs pre-collected data sets to learn from. created by us humans who can cre…" (`ytc_UgxBixXAO…`)
Comment
If robots produce everything, why would you even need someone to buy things? People who own the robots can just exchange goods as needed. And use those who don't own the robots as slave labour.
Universal basic income is a neat idea, but if labour is no longer needed, what are you going to use as leverage to make it happen? Nobody cares if you go on a strike if your output is not needed. If people rise to a rebellion, the owning class will just squash them with automated killer robots.
Platform: youtube | Topic: AI Harm Incident | Posted: 2024-08-13T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyMAqgUbwBrkqAIAIV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwDmLGnNb9A2MmzpId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPT8OGmrDkrMUmkHV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPEVbllg8q0ZZJmN14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy0bf57hLyZiThKZcl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxySjVvxSlo3lyIfut4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_vByami7VooCO6dV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzGyxTdYCpm7_5XGXF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyDuu1OJHhDouB9ssR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzEzP5BipGyhUt_JaF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
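A raw response like the one above can be parsed and sanity-checked before its dimensions are written to the coding table. The sketch below is a minimal example, not the tool's actual validation code: the allowed values per dimension are only those observed in the sample response here, and the full codebook may define more categories.

```python
import json

# Dimension values observed in the sample response above; the real
# codebook may include additional categories (assumption).
OBSERVED_VALUES = {
    "responsibility": {"company", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and report unexpected values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                print(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records


# Hypothetical single-record response, mirroring the format above.
raw = (
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)
records = parse_coding_response(raw)
print(len(records))  # 1
```

Each record's `id` matches a comment ID in the lookup index, so coded dimensions can be joined back to the original comment text.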