Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Genuine question, but you mentioned how these were run through Nightshade in ord…" (ytc_UgytbyjQe…)
- "Approaching AI not as a technology but as a science all unto itself is what I th…" (ytc_UgxpiNVBe…)
- "One could write a story about the vain struggles of man vs ai. A long and hard f…" (ytc_UgwwM7vyI…)
- "1. Intellectual properties should forbid selling their images with AI. 2. AI sh…" (ytc_UgwvNI8JZ…)
- "Self driving cars don't get , drunk, or fall asleep when they are being used pro…" (ytc_UgxP0aa13…)
- "@EyehatePersona5 Guy in the video was using basic autopilot, which is only meant…" (ytr_UgyBQ4XOb…)
- "The logistics of training really show us that we're nowhere **near** human-level…" (ytc_UgxA85KcJ…)
- "Ironically this shows a great use of AI art for actual artists through the creat…" (ytc_Ugw7gO0pI…)
Comment
> The data ChatGPT uses to train on is curated by the people who provide the data. ChatGPT does not use information collected during sessions with users, unless that data presented as training data by the developers. If the people responsible for curating this information isn’t filtering out personal information, it isn’t ChatGPT’s fault. ChatGPT is just a program that has been instructed to train on the data given to it. The people feeding ChatGPT are responsible for making sure the data doesn’t compromise public privacy and safety.

Source: youtube · Video: AI Moral Status · Posted: 2025-06-17T08:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxl_jSHoVn7IzGp80V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzcGa9ySiMYeiYqSpp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyBxJhxZZd16q2LqiF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGa9rPt5_kNJ57RhF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy6E4lOOeSfGjHTmzB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJnMFG1hw5EcEPyip4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx2lTsLo7F2_KzO3yZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeFvWFeC-O_FuE5XB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxvTBb4nM2ZF_Na-WN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRT8GYTFA8bcrWRHF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
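The lookup-by-ID view above implies that each raw batch response is parsed and indexed by comment ID before the per-comment coding result can be displayed. A minimal sketch of that step in Python, assuming the label sets inferred from this single sample (the real codebook may define additional labels):

```python
import json

# Allowed labels per dimension. These sets are ASSUMPTIONS inferred from
# the sample response above -- the actual codebook may include more.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "resignation", "indifference", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the rows by comment ID.

    Raises ValueError on malformed rows or labels outside the allowed
    sets, so a bad batch fails loudly instead of silently polluting
    the coded dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row without id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row batch for illustration:
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"deontological","policy":"industry_self",'
       '"emotion":"indifference"}]')
print(parse_batch(raw)["ytc_x"]["policy"])  # industry_self
```

Indexing by ID rather than list position also makes the pipeline robust if the model returns rows out of order or drops a comment from the batch.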