Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Honestly trying to use an AI detector is a horrible choice to check for "plagiar…
ytc_UgxxI-Kf9…
All I'm saying is you morons better not be bullet proofing these dumbass robots.…
ytc_UgyhcoqSL…
"It's so over for artists"
Artist here. At 8:58, we can see that the AI altered…
ytc_UgzFTCVIw…
"Ai is the future!" Nah companies are getting too desperate pushing it onto a fa…
ytr_UgyfGjZl0…
Well AI is not concious....I'll go back to my old saying...If it looks like a du…
ytc_Ugx3DF8jO…
If you think face recognition is going to wrongfully mis identify people of colo…
ytc_UgzG0O9bQ…
I think we need to stop using human terms to describe what ai does. No, ai does …
ytc_UgzIz6Q7x…
Man i wished ai art didn't exist now We're getting called ai artist when We're r…
ytc_UgwlhUF1H…
Comment
This is amazing, in a short amount of time and how far we come, I like to see this in a dark room and a hidden camera all alone and let them chat with each other, I bet we would not like the answer they say. I am for all of this but humans must be in control and a global off switch ready in the cloud, before the shit hits the fan. With are current drive towards tech, it only natural that AI and Robots will out perform the human race, Just look at the evolution of the PC? If we don't keep this in check, it can and will get out of hand.
youtube · AI Moral Status · 2022-11-05T11:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyMLClJZr9zziKHsOB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxfeNQ7ZiqlrXiNjOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXDZhmUcG3ORGUc_14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgymI4NBwGgCBJWs5F54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7mv8CDpMRt7CmxtN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyncpjAvuqq57oWImt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzc0qa4fBHo0x-a_bV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyiEFunAJlCgRezqd14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxfi61DfDQRWlGFpfd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzHiXcIxl18ntyTF054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
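
The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a record could be parsed, checked for completeness, and rendered back into a dimension table — the `validate` helper and the single-record sample are illustrative, not part of the tool:

```python
import json

# Keys each coded record is expected to carry, per the sample output above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

# One record copied from the raw LLM response shown above.
raw = """[
  {"id": "ytc_UgzHiXcIxl18ntyTF054AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]"""

def validate(records):
    """Return the ids of records missing any coding dimension."""
    return [r.get("id", "?") for r in records if EXPECTED_KEYS - r.keys()]

records = json.loads(raw)
print(validate(records))  # → [] (no record is missing a dimension)

# Render one record in the same markdown-table layout as the Coding Result.
for rec in records:
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"| {dim.capitalize()} | {rec[dim]} |")
```

Checking for missing keys before rendering matters because an LLM response is not guaranteed to be schema-conformant; incomplete records can then be flagged or re-coded rather than silently displayed.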