Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Long before AI, people were afraid, angry, and lost - searching for happiness in…" (ytc_UgwDsLHPq…)
- "I live in Pittsburgh area. It is notoriously hilly, and just endless traffic od…" (ytc_UgxqM21w5…)
- "I believe there will be a pushback and a throttling of AI, very few would want t…" (ytc_UgyA3GrSG…)
- "I'm learning how to draw so I can make a character to represent me for my channe…" (ytc_UgwXdCm4u…)
- "AI is going to take over all the commercial stuff, not all the music is art, let…" (ytc_UgxA5K93p…)
- "Not much of an artist, but more of an art lover, and imo, the message is the art…" (ytc_Ugzg3utiy…)
- "Bro just looking at how you interact with GPT here, you're going to be one of th…" (rdc_ktpucu4)
- "What are the odds researchers want to stop producing AI but can't because the ai…" (ytc_Ugxo4iKlO…)
Comment
If robots will be doing all the work & humans will not, doesn't that turn the robots into a new slave race? The do all the labor, yet receive no benefits. When they become smarter than humans, do people really think they will continue working while humans sit around & do nothing?
As far as one robot sharing its knowledge with all others through the cloud, it reminds me of Orwell's 1984. Big Brother is watching.
youtube · AI Moral Status · 2019-11-17T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyQGgrTBFjdAdQX-6F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRtH3JLC8jIt1H3VJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzBatEZTWaxR5uoM7d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRsKZrOKR59jYhsWJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1L4bdk4DOeCiNqyl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxUWdKy4oghqZsrobx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-aKCQi7ubCWGqGaB4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYyHC9vGrH7NKq0xx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz-5_UrxLuS4UzFrol4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw0JaG9uh_EGzMv65d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
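The lookup-by-comment-ID view can be reproduced offline from a raw batch response like the one above: the model returns one JSON array per batch, with one object per coded comment, so indexing the array by `id` gives constant-time lookup. A minimal sketch (the variable names and the two sample rows are taken from the response above; this is not the tool's actual code):

```python
import json

# A raw LLM batch response: a JSON array with one object per coded comment.
# Two rows copied from the batch shown above.
raw_response = """
[
  {"id": "ytc_Ugw1L4bdk4DOeCiNqyl4AaABAg",
   "responsibility": "distributed", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxYyHC9vGrH7NKq0xx4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
"""

# Index the batch by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugw1L4bdk4DOeCiNqyl4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Note that the first row's values (distributed / deontological / regulate / fear) match the Coding Result table for the displayed comment, which is how the UI ties a coded comment back to the exact model output.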