Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Before opening this video I got strucked by an Ad from Base44. They try to sell …" (ytc_UgwBnjdb5…)
- "This conversation sure turned away from AI, and turned political in an abrupt lu…" (ytc_UgwPMi7E9…)
- "I do believe ai will become sentient one day maybe not in 40 or 50 years or even…" (ytr_Ugwd6gYp8…)
- "Right? If I recall M-5 used the Enterprise to destroy the ship they were using t…" (rdc_m12n6pj)
- "It seems to me that the sexism problem in NC is so cosmically huge, that it's go…" (ytc_UgysaM4Dt…)
- "I take the AI "supporters" with a HUGE grain of salt. The tech industry is spend…" (ytc_UgwtShXJE…)
- "You can travel now. Outside of a small window last year you can do just about an…" (rdc_hm97x03)
- "While this is an interesting conversation, there's a fundamental misunderstandin…" (ytc_Ugw6OEMWK…)
Comment
We don't need actual AGI to destroy most of our jobs. And the people who are making decisions about the scope and scale of "AI" are, in fact, trying to eliminate most of our jobs without bolstering any kind of safety net. It will be like the apocalypse of the horses, all over again. Your robot maid, in today's world, leads directly to people starving to death.
Source: youtube · Video: AI Moral Status · Posted: 2025-08-17T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwbn7Tls4SKwsKIV2F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyWwrFOxTcyg72MbyN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxfWvPG-99GFEnOKS94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7onaB50SlPiC-2jN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzmmtgaxN_QSW9oy_V4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwu7s1SpTxrjkN7IpJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx5QHhgBYj9KIl7sMx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyp0VC0XfGlmOjKkVZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyAhNw2ipoZl5UwHeV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw7Z91zqJwFx2PP-ZR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
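A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the codes visible on this page (the real codebook may include categories not shown here), and `parse_coding_response` is an illustrative name, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown above.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue"},
    "policy": {"liability", "regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of row objects)
    and keep only rows that have an id and valid values in every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be joined back
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"outrage"}]')
print(len(parse_coding_response(raw)))  # → 1
```

Dropping invalid rows (rather than raising) keeps one malformed line from discarding an entire batch; a stricter pipeline might instead log the offending row for re-coding.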