Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Mofo’s watch way to many sci fi movies lol, AI will never gain consciousness, it will never make good nor bad decisions, AI is purely a mathematical engine and at the very most can simply replicate emotions through pattern recognition but by no means would it ever be able to initiate any type of emotion, that’s simply impossible on a technological standpoint, most of these speculations are purely fantasy and human imagination, AI will simply do whatever it’s programmed to do, that’s it.

youtube · AI Governance · 2025-12-08T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxMaONjkdb1fTusFJV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxh9C6p8IF8RDdukfl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5YEXyK_Hr6lEJLsR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzx2-ovee5Jup-_e8F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx_CIJ9q7tQjMF-CjR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwi1-5o9A-XUfWD8Ox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgBDU_pygKKgdPzV14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJ1mRt3-Ff0N0E_gZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwACHNuX4tdxvC9Fnt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxzCxHEySWFj7Ef-Ql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
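A raw response like the one above can be parsed and sanity-checked before the labels are stored. The sketch below is a minimal validator; the allowed label sets are an assumption inferred only from the values visible on this page, not from the project's actual codebook, and the sample record ID is hypothetical.

```python
import json

# Allowed labels per coding dimension -- ASSUMED from the values shown
# in this batch; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "government", "company",
                       "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any out-of-vocabulary label."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record response, shaped like the batch above.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # 1
```

Validating against a closed vocabulary at ingest time catches the common failure mode where the model invents a label outside the coding scheme.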