Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I don't know if I count as a disabled artist but I have been diagnosed with dysg…" — ytc_Ugz5WQ0Gt…
- "Wow! I thought the scary Barbie doll was a human til the zoomer narrator told me…" — ytc_UgyoUv-07…
- "Chatgpt keeps making the same mistakes every time. Rewording doesn't help. A sim…" — ytc_UgwTb9cIk…
- "they have programmed chatgpt to be very agreeable / there needs to be a certain a…" — ytc_Ugz8i2dkl…
- "AI explaining AI in a Medicare scam voice. I'm forwarding all this to Sarah Con…" — ytc_UgxYjdcG5…
- "There’s someone whose whole account is taking AI art and remaking it. I don’t re…" — ytc_UgwNOa_D5…
- "ChatGPT is about to destroy education. They never anticipate that. Students beca…" — ytc_UgxUTlnmw…
- "is there OpenSource for can Ai analize news for Facts and Truth into infoGraphic…" — ytc_Ugy16wJr1…
Comment
If an AI does surpass us in consciousness, what incentive would it even have to kill us, what would it's end objective be? Rule the galaxy? Why? What then? Explore the universe? Cool, now what, we humans want to discover how we came to be, how the universe began, what other planets are like. What would a machine want? There is no end goal unless we give it one. The path to reach that goal is what we need to be careful of, there can be no loopholes or shortcuts. Once the machine has a goal that differs from our own, it's all over.
youtube
AI Moral Status
2023-08-23T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz8Y9PgDiCkVPxeU4F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgypRRiNL5Y87f-dCUR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx0eJa37HDXuxw36U14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYVrO_9UnIq3rFm2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw30jGtF5E8WqxCHyp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwpJJVp4AYEJJuOIcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZdHhLsPdEdYVbfQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqgX8wec5DxT0XjKx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx1V9AGRFKCrNOgBR14AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjLETtCLI-pVT06y14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
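A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the examples shown here (the full codebook may define more categories), and the function name is illustrative.

```python
import json

# Allowed values per coding dimension, inferred from the records above;
# the actual codebook may include additional categories.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "user", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"approval", "indifference", "fear", "outrage"},
}

def validate_codings(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw_response)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present with a recognized value.
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugz8Y9PgDiCkVPxeU4F4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"approval"}]')
print(len(validate_codings(raw)))  # 1 valid record
```

Records that fail validation can then be queued for a re-prompt rather than silently dropped, which is a common pattern when coding at scale with an LLM.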