Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "What if teachers actually taught kids rather than dumping them in front of scree…" (ytc_Ugwe3xMwJ…)
- "So AI has helped me a lot with mental health. I have created a personality tha…" (rdc_mlkc9g4)
- "The longer time has gone by i have gotten less and less intimidated by AI slop. …" (ytc_UgynK_U2A…)
- "When ai art got popular, I tried to make some ai art with a bunch of different a…" (ytc_UgwQdYMDK…)
- "from the footage I've seen, the AI would choose to hit a car rather than a perso…" (ytr_UgynvnqBE…)
- "Sure. Here’s a new $100k self driving car. It still requires you to be as comple…" (rdc_f6xgwqs)
- "She seems so sweet for a robot who could rip your arm off & beat you with it.…" (ytc_Ugw6ACM_b…)
- "While implementation and use of a new technology can be scary, the potential for…" (ytc_UgyEhveMT…)
Comment
Many have said that the end of humanity will be caused by humans themselves, not by some intervention or attack from others. On the one hand, humans are destroying Earth's resources faster than ever before; at the same time, investing in robots and AI applications everywhere is going to cut jobs and lessen demand for low-skilled people.
Let's face it, humans have had a good run. As with everything, there is a beginning and an end. We have peaked as a civilization, or we are on the verge of the peak. In the future the human population will decrease and won't have as plentiful and comfortable a life anymore, not just because of the economy but also because natural resources are running out and becoming more expensive, and climate change ruins livelihoods for billions and fuels migration and conflicts. In the next 50 years life will become worse for most people.
youtube
AI Moral Status
2022-09-07T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxybLeeYuDibfU5nEl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOGzDXTJmsWnLSnXZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzHu-hHiRCW0tLymX54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGF5FjdGEqIUyL9Z54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzLDGAj6lFkx2v6Blt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzECNVnxSby3RP3bhh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz43OZ74LMmwgQVKPR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz0YQhJF_0YcrS5_O54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxD3dNKs6kptI61b1t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzAEmQHl58cs506TwB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
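A raw response like the one above has to be parsed and sanity-checked before its rows can be stored as coding results. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are only those visible in the samples on this page (the real codebook may define more), and the function name and structure are illustrative, not the tool's actual implementation.

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page (assumption -- the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only rows with an id and
    a valid value for every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed row: no comment ID to attach the codes to
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

if __name__ == "__main__":
    raw = ('[{"id":"ytc_example","responsibility":"distributed",'
           '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
    print(parse_coding_response(raw))
```

Dropping invalid rows (rather than raising) lets a batch of ten codings survive one hallucinated label; the rejected IDs could then be re-queued for recoding.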