Raw LLM Responses
Inspect the exact model output for any coded comment. Every coded comment can be looked up by its comment ID, or reached through the random samples listed further down.
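Such a lookup amounts to retrieving the stored raw response for the batch that contained the comment. Below is a minimal sketch, assuming the results are kept in a SQLite database; the table and column names (`raw_responses`, `response_text`) are hypothetical, not the project's actual storage layer:

```python
import sqlite3

def raw_response_for(comment_id: str, db_path: str = "coding.db") -> str | None:
    """Return the stored raw LLM response for the batch containing comment_id.

    Assumes a hypothetical table raw_responses(comment_id, response_text)
    mapping each coded comment to the batch response it was coded in.
    """
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT response_text FROM raw_responses WHERE comment_id = ?",
            (comment_id,),
        ).fetchone()
    finally:
        conn.close()
    return row[0] if row else None
```

Under that assumption, `raw_response_for("ytc_Ugy5hl5uVhwVhess-cZ4AaABAg")` would return the batch array shown under Raw LLM Response below.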
Random samples:

- As an artist 🤚only a real artist can illustrate emotions on canvas 🤚not ai and … (ytc_Ugx1edNcR…)
- Sure. We can't automate everything. Until we can. At best, you're demonstrating … (ytr_Ugw5zT_gA…)
- Interesting that they didn’t bother to give her arms, but still ensured that she… (ytc_UgyfI-4sm…)
- The other frightening thing about AI is right now is the worst it will ever be i… (ytc_Ugwp2bGwv…)
- Everyone needs to play "Detroit become human" that game predicted the future of … (ytc_UgyBRD_WB…)
- I use AI daily to build new bots. I'm getting older I can’t remember code like I… (ytr_UgwT8rmq4…)
- It's certainly more realistic but far from "hyper-realistic", it's still clearly… (ytc_UgwUewDVA…)
- Ezra Klein is a giga AI doomer. Maybe EY just made really bad arguments if they … (ytr_UgwgVNJgS…)
Comment
> Eliezer Yudkowsky says the same thing. He’s been in AI safety research for 20 years. He said in 2015 there was some A.I. conference and Elon and others with money were there but he said he gave up on elon being able to or wanting to do anything to slow it down. Eliezer seems to have given up. Almost quit work. He says anyone in AI should find another job. There’s almost no way to stop this but we can maybe slow it down and enjoy our last few years.

youtube · AI Governance · 2023-03-22T04:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
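All four dimensions are categorical. For reference, here is a minimal sketch of the label sets as Python enums, inferred only from the values visible in the samples and raw response on this page; the project's actual codebook may define additional categories:

```python
from enum import Enum

# Label sets inferred from values visible on this page (assumption:
# the full codebook may contain categories not shown here).
class Responsibility(Enum):
    DEVELOPER = "developer"
    USER = "user"
    DISTRIBUTED = "distributed"
    AI_ITSELF = "ai_itself"
    NONE = "none"
    UNCLEAR = "unclear"

class Reasoning(Enum):
    CONSEQUENTIALIST = "consequentialist"
    DEONTOLOGICAL = "deontological"
    MIXED = "mixed"
    UNCLEAR = "unclear"

class Policy(Enum):
    REGULATE = "regulate"
    BAN = "ban"
    NONE = "none"
    UNCLEAR = "unclear"

class Emotion(Enum):
    FEAR = "fear"
    OUTRAGE = "outrage"
    APPROVAL = "approval"
    INDIFFERENCE = "indifference"
    MIXED = "mixed"
```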
Raw LLM Response
```json
[
  {"id":"ytc_UgzAhrSdO4H44TncECN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw6RKdGMnKrpu9pF3N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyJDZwx0pbwzqE8Xpd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzud7eaKgR1e0rDrTh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw0HhRTSw1E29mV_DV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx2tNJpfzbTqfT8X0B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzcwhO9eRFBxcgvNbF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy5hl5uVhwVhess-cZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzeTtmM3Rne0CDgi8d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwpgFHiNT0e9p1D4jJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
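Each raw response is a JSON array with one object per comment in the batch (ten here), keyed by comment ID. A minimal sketch of how such a payload might be parsed and validated before storage; the function name is hypothetical, and the required-key check is an assumption about the pipeline rather than its documented behavior:

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch_response(raw: str) -> dict[str, dict]:
    """Parse one raw LLM batch response into {comment_id: codes}.

    Fails loudly on malformed model output so that incomplete records
    are never silently stored alongside valid codes.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    coded = {}
    for rec in records:
        if not isinstance(rec, dict) or REQUIRED_KEYS - rec.keys():
            raise ValueError(f"incomplete record: {rec!r}")
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return coded
```

For the response above, `parse_batch_response(raw)["ytc_Ugy5hl5uVhwVhess-cZ4AaABAg"]` yields exactly the four values shown in the Coding Result table.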