Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I once read a tweet that said something along the lines of: I want technol…" (ytc_Ugwc6ogqn…)
- "Nice way to tell us that you have no idea how to use AI. Current SOTA models ar…" (ytc_UgxkqTHcs…)
- "The other thing to consider is most artists hide their good stuff behind a pay w…" (ytc_UgziDDhxx…)
- "I do understand, but I use AI to make unique creations no one else is really doi…" (ytc_UgwJchv2f…)
- "This moment when you realize that this robot (with a superior A.I) could erase h…" (ytc_Ugjb84FZu…)
- "He is talking nonsense about risks with AI ! And that gives not a good reputatio…" (ytc_Ugw7A_VuP…)
- "The people behind AI are those dipshits that would bug you on the playground, na…" (ytc_UgyKSM3w1…)
- "It's kind of bad. You can tell when the robot is running, the legs are moving an…" (ytr_UgwE_b5QJ…)
Comment
The whole a human does the same thing that a machine does, for example learning and taking what they learn to replicate it, is complete nonsense. I get what people are trying to get at, justifying the stealing of artist hard work that took years for them to master. That’s why I always find that whole argument stupid. There no way you can justify it. When humans attempt to replicate they will not 100% recreate the piece a human can implement there own personal touch or be inspired by another piece and implement that into the work. Humans don’t seek copy rather imitate what they see. That’s what makes human work human. The AI’s work will never feel human unless it learns to add its own personal touch
Source: youtube · Topic: AI Responsibility · Posted: 2023-06-14T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwqkUifUKROcJyt8IZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw6j-OhdcSnVcw_MFl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFiD2nquVi-MNyeQR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxT7TJwp56WcWoQgrR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz3UyAmzU8jRgpitTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzwutzq6iolP6Gd5sZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxhs6wzVmSWxvQ6F2d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_Ugxfh72tjmSKrsXDYM94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyeIQAPBK_VM7d8JBx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwjhnUaB75sSNU9aZB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
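The "look up by comment ID" step can be sketched in a few lines: the raw LLM response is a JSON array where each element carries an `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`), so a lookup is a parse followed by a linear scan. This is a minimal sketch, not the tool's actual implementation; the function name `lookup_coding` is hypothetical, and the sample payload below reuses two rows from the response above.

```python
import json

# Two rows copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxT7TJwp56WcWoQgrR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz3UyAmzU8jRgpitTl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

# The four coded dimensions that appear in every row.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw response and return the coded dimensions for one
    comment ID, or None if that ID is not in the batch."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            # Fall back to "unclear" if the model omitted a dimension.
            return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return None

coding = lookup_coding(RAW_RESPONSE, "ytc_UgxT7TJwp56WcWoQgrR4AaABAg")
print(coding)
# {'responsibility': 'ai_itself', 'reasoning': 'deontological',
#  'policy': 'unclear', 'emotion': 'outrage'}
```

A real tool would likely index all rows by ID once (e.g. into a dict) rather than rescanning the array per lookup, but the scan keeps the sketch self-contained.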