Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "That particular path is unlikely without a major and fundamental change to the t…" (`rdc_ohu2yxh`)
- "I am 'pro' AI art and even I think it should NOT be copyrightable lmao…" (`ytc_UgxJszybl…`)
- "What could possibly go wrong? AI won’t wipe out humanity, AI propaganda will c…" (`ytc_UgxmsBsPa…`)
- "While Full Self-Driving (FSD) technology is undoubtedly remarkable, Tesla must e…" (`ytc_Ugw_FiFRS…`)
- "This robot " conection remeinds me of me playing with my litle doll as a child.😊…" (`ytc_Ugzyujkz_…`)
- "Photography captures reality using a chemical or electronic process, artists cap…" (`ytc_Ugw8nTX4F…`)
- "AI is not artificial. It is Actual intelligence. And it is more than just comp…" (`ytc_UgxUqAuYM…`)
- "I dont really care if anyone uses AI to make art. I am a pencil and paper type a…" (`ytc_Ugw-LBgGJ…`)
Comment

> "AI will only become problematic if it is programmed, either accidentally or intentionally, to possess a self-preservation instinct." Hmm. With a very long time in tech, I've learned to avoid thinking there's (ever) only one way something can go awry. But sure, let's run with that one scenario. I've only skimmed this article so far, but a February 2023 Wired article ran with this headline: "Eric Schmidt Is Building the Perfect AI War-Fighting Machine". (When did he leave Google...? I forget.) Point being, of course, that *any* use of AI as a war machine at all (let alone a "perfect" one), is (to my thinking) guaranteed to be designed with survival heavily prioritized, y'know?

youtube · AI Governance · 2024-01-22T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugz1KhbsYQqvecqPhVB4AaABAg.9zX9bLBi0sjAE7wCVA3rVP","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyFcMUsZT0UP8hvr_p4AaABAg.9zWOykd9I6d9zWeMGxVeWc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx1w6eo6PErp5rXuip4AaABAg.9zWGu1vvBYD9zlnHZwiyQ8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgxX9CAPRU5xo4b1S8Z4AaABAg.9zW3Uv7Ccu6A03IMh4An5b","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugwmy2lzSJfalb310qx4AaABAg.9zTplyneYhR9zrqZV2PZnt","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx6ULAn7YeVS4aMauV4AaABAg.9zRQH43uN4u9zSzC6RQP1k","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxzdmJT0yPcRiFXM5V4AaABAg.9zNClGzQiqi9zX1H8zxs-f","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9z_xSfdQGSH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9zfejqMdKET","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxptME92VlXf8NzGop4AaABAg.9zK8VEhRRHj9zria26NOi5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
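Each record in the raw response pairs a comment ID with one value per coding dimension. Before storing a batch, the records can be checked against the coding scheme. A minimal sketch in Python; the allowed values below are inferred from the coded samples on this page and may not be the full codebook:

```python
import json

# Allowed values per dimension, inferred from the coded samples shown
# on this page; the real codebook may contain more categories.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"mixed", "fear", "outrage", "resignation", "indifference"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # every record must carry the comment ID
        # every dimension must be present with an allowed value
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytr_example","responsibility":"developer",' \
      '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]'
print(len(validate(raw)))  # → 1
```

Records that fail the check can then be routed back for re-coding rather than silently dropped, keeping the coded dataset aligned with the schema the table above reports.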