Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I Thank Palki Sharma from First-Post for bringing this extremely important issue…" (ytc_UgzEA095N…)
- "We're glad you're noticing the improvements! If you're interested in seeing cutt…" (ytr_UgyegAomq…)
- "We need to drop ai for a few decades and focus on our humanity and arts…" (ytc_UgyG8bIvE…)
- "Your proposal for addressing the crisis caused by the development of artificial …" (ytc_UgwWAYmZW…)
- "Artists have often been overlooked and undervalued. Unfortunately, pursuing a ca…" (ytr_Ugxn39hqY…)
- "Idk about this. Fan art itself isn't legal, but it is considered promotion so ma…" (ytr_UgxXMb_Bt…)
- "I would think with the intrusive access to everything we use already, this is ju…" (ytr_Ugzc_HNPm…)
- "Fake fake fake 🤥 ...it's bs folks unless your job is so redundant Ai won't repla…" (ytc_UgxiFWKBw…)
Comment
It seems that Elon doesn't understand that the type of AI he refers to is not the AI people are developing; rather, he refers to a certain kind of conscious agent that would possess the computing power of the AI people are developing. The thing is that the ghost in the machine, often called the singularity, is something we cannot in principle create no matter what kind of tools and technology we have, because that would mean we would be masters of the universe in no time. Just consider: if we could produce a self-replicating living cell from crude matter, how far would that be from a conscious agent that emerged in the last few seconds of the 24 hours of evolution?
Source: youtube · AI Governance · 2023-04-18T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwJIQWH2jl7IbZayOZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyX2CIj0Kbr9uRX7d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwS52u1v0DRrAl2ec14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugza3zzRTM8QWStvFDp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyT23MzcJ-yre4hLHV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1dMBVvhXPvC4BNEB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgziWOaTSjVb4GzmXaJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgznkJbPzfZcxx9ldml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWHu3z4QKbrPWyLhB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxm1WtIYn8kbO6uN3F4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
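The raw response is a JSON array of coding records, each carrying the comment `id` plus the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step might parse that array and index it by `id`; the field names come from the response above, but the surrounding tooling is assumed, not shown here.

```python
import json

# Two records copied verbatim from the raw LLM response above
# (the real response contains ten; this is just an illustrative subset).
raw_response = """
[
  {"id": "ytc_UgwJIQWH2jl7IbZayOZ4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwS52u1v0DRrAl2ec14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]
"""

records = json.loads(raw_response)

# Index by comment ID so a single coded comment can be fetched directly,
# as the "Look up by comment ID" control on this page does.
by_id = {record["id"]: record for record in records}

row = by_id["ytc_UgwS52u1v0DRrAl2ec14AaABAg"]
print(row["responsibility"], row["policy"])  # developer regulate
```

Indexing into a dict keeps lookups O(1) per ID, which matters once a corpus holds many thousands of coded comments rather than one batch of ten.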