Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytc_UgxYBKKto…: “Popcorn time. Been having loads of fun with Dall-E 2, guess it’s time to experim…”
- ytc_UgzDWzgEr…: “AI is programmed by humans 🤔As humans our driving force is to survive. To ask AI…”
- ytr_Ugzrlpz7h…: “Yes, Asimov's laws immediately came to my mind when these concerns first emerged…”
- ytc_UgyO1AfNH…: “He obviously has watched too many sci Fi movies as a child. We created ai and it…”
- ytc_UgyvcidFe…: “So frigging cool! Reminds me of "Bishop" from Aliens 😁. Just needs milky liquid …”
- ytr_UgyJAwt9b…: “@diyr791the employer is can choose whether to hire Earth to purchase self-drivi…”
- ytc_UgxDjDCzk…: “That is why it is being offered for free now so that we can test the limites of …”
- ytc_UgytmXfKI…: “You thought the pandemic was over but AI taking your jobs is just phase two of t…”
Comment

> According to Yan LeCun LLMs and symbolic understanding aren’t sufficient to knowing “reality” and LLMs fall behind a child of 4 in that regard. He is not a naysayer but has some interesting shit to say: https://youtu.be/RUnFgu8kH-4?si=gi--ZSMAATKZ0MXr

Source: YouTube video “AI Moral Status”, posted 2025-03-17T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxzVsyMx2ubmfwRmPZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyOwIDeJ5o7etc5Czh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
  {"id":"ytc_Ugzap8Gn8h24MuCDHRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw9hZ11V7i1pSRz8ex4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzUqbBJACD7902_hi14AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzGaRixydj-yPm3W2t4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwLZpRcgfvKUtEaEp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxczyGIN8rOQsIepaB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtIqZSDrF-TlrCbK94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
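The lookup-by-comment-ID view above can be sketched in Python: parse the raw LLM response (a JSON array of per-comment codings, with the field names shown in the response) and index each row by its `id`. This is a minimal sketch, assuming the response is valid JSON; `index_codings` is a hypothetical helper, not part of the tool itself.

```python
import json

# A small excerpt of the raw LLM response shown above: a JSON array of
# per-comment codings with fields id, responsibility, reasoning, policy, emotion.
raw_response = """
[
  {"id": "ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyOwIDeJ5o7etc5Czh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "sadness"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse the model output and index each coding row by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_codings(raw_response)

# Look up one coded comment by its ID, as the inspector does.
coding = codings["ytc_UgyOwIDeJ5o7etc5Czh4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company sadness
```

The dict-by-ID shape makes a single lookup O(1), which matters if thousands of coded comments are loaded at once.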