Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Don't worry, the future is in good hands! Our AI models are here to assist and m…
ytr_Ugx4IMW92…
Mark my word, itʼs not AI, itʼs people behind of it, AI is a mask for them!…
ytc_UgzKSASyL…
can you even call angel engine ai? the images are ai, but the animation style se…
ytc_UgxgQyrHl…
Automation means one of two things: 1) fewer jobs making/doing the same amount o…
ytc_Ugz4JP2iK…
AI lacks nuisances but it's probably smarter than some profe$$ionals I've dealt …
ytc_UgyBRI7FM…
Unless we get more alien tech from crashed saucers, I don't think we will have A…
ytc_UgyCzxknn…
What about humans developing and evolving in ways that are not being measured? L…
ytc_UgyTO5xHt…
Ai ppl:I use AI a lot so I'll be an artist!
Me:Take a pencil and try to draw…
ytc_UgwvT6kBp…
Comment
It’s intentional that the development of AI isn’t “safe” mainly with US developers but the EU has made attempts to regulate AI early on. 3:09 This explains while privatization and the increases of monopolies will allow AI to remain less regulated.
youtube
Cross-Cultural
2025-09-28T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxPqjz-blhDdyZSn1R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyW9jdPaEp21Ml4uIF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVlQMcqojc39nvkQJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugysg7XSAWsAvlh10dt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgywjBrcse2ow-ALcWF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyf7IRJXxPIVD_nbaJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyD8ErZSi5lEGFhlGd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDvmGEWPBYcQN8DJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPLzcb7souZeXFpLJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxDyMTboZgBJRy7lP14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
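A raw response like the array above can be validated before the codes are stored. The sketch below is illustrative, not the tool's actual pipeline: the allowed value sets are assumed from the values visible in this dump (the real codebook may define additional categories), and the `validate_batch` helper is hypothetical.

```python
import json

# Allowed values per coding dimension -- assumed from the values seen in
# this dump; the actual codebook may permit more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dump start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and drawn from its allowed set.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgxPqjz-blhDdyZSn1R4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"},'
       '{"id":"bad_id","responsibility":"developer","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1 -- the malformed ID is dropped
```

Filtering at ingest keeps a single malformed record from poisoning downstream dimension counts; invalid rows could instead be logged and re-queued for re-coding.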