Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- "By that estimation, the majority of jobs will be automated in half that time, 20…" (ytr_UgzI3Liml…)
- "Don't expect too much from this free version Guys, instead try the 4.5 nd you'll…" (ytc_UgwHrST6s…)
- "I don’t trust humans, why would I trust a robot, they are capable of doing what …" (ytc_Ugy0HsEc7…)
- "Didn't Isaac Asimov solve most of these problems in 1950 with the Three Laws of …" (ytc_UgxSaXpuX…)
- "idk why people doubt its real - i've had a stalker for over ten years now who ha…" (ytc_UgzR0nTRe…)
- "I genuinely have my doubts about AI. It's clear that it still has the fingerprin…" (ytc_UgwzsvT0R…)
- "Does sentient AI deserve rights? Beyond its right to live, what other right does…" (ytc_UgwkWhahd…)
- "> Russia produces twice as many tanks as the 5 top European countries combine…" (rdc_mcqvmtw)
Comment
I've never heard (in 15 years) Eliezer depict an actually broad general AI interrupting with the desires and wants of humans. It's always some kind of narrow paperclip maker with no off-switch. I think that's where the arguments breaks down for most non-technical people, is the contradiction between "smarter than human" but also "extremely dumb narrow goals".
youtube · AI Governance · 2024-11-12T03:4… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzhlDX1csR8XkjK9iJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx01-JRoygImPi2oB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-B4NOFCx3uGYQj8l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy1XS_weEDEdybQWnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzBjl1hpXUD7IOFfKp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwBmtcWIE08QHMHQCd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwXAnBZ0P_QQPV-kR54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy34WB0Kv3W8h45zpx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyzG9twp3oIzLyBuHp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxJglRewQqd0ucvVEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
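The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a response could be parsed and sanity-checked in Python follows; the allowed value sets are inferred from the sample output shown here and are assumptions, not the tool's authoritative codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# These sets are an assumption for illustration, not the real codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify each record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response, mirroring the format above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"mixed","policy":"unclear","emotion":"mixed"}]')
coded = parse_coding(raw)
print(coded[0]["responsibility"])  # developer
```

In practice a coding pipeline would also need to handle malformed model output (truncated JSON, extra prose around the array) before `json.loads` succeeds.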