Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "That not a good idea to having a fighter robot it has no feelings to fell pain i…" (ytc_UgzuKn0OQ…)
- "I actually think open ai is probably right here. But i totally agree with the ju…" (ytc_Ugy8uGPRq…)
- "This. This is the opinion that I have with what I figured out about all this. …" (ytr_Ugz098PgS…)
- "Clearly these people have watched the movie i robot too many times sounds just l…" (ytc_UgzVzM0Hq…)
- "ChatGPT actually sucks at writing specific code, makes up methods and functions …" (ytc_Ugwnc0Hwi…)
- "No let's not forget AI - I can cut through this speech and simply say this: In…" (ytc_UgwhVJKxQ…)
- "I would ask 7 figures once they ask to hire real people again to fix this AI cra…" (ytc_UgxAqSeGZ…)
- "We'll destroy your ai if you take our jobs. I'll dedicate my life to destroy AI …" (ytc_UgzuRwFOs…)
Comment
Ezra Klein spent years telling his story of why he's not scared of AI - something like "I bought a toilet seat once and now amazon keeps trying to sell me toilet seats! So silly! How could it be a threat!"
Now, several years later he's saying "I put a typo in, and it takes me literally! It doesn't come back to ask me for clarification. It just wants to be helpful! Why isn't that comforting?"
If you don't want to take the topic seriously enough to learn about it, and to listen to experts, please don't discuss it on your show.
Source: youtube | Category: AI Governance | Posted: 2025-10-17T13:3… | ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugx4Hnkh8xD9SF74W-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyzzw73Hyi8hPakJp14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxMLKfL5W0viIGu9xl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzK9OtyDIgvoXHpFuJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy0hMXzNfinvXrNCMZ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwuRAK86clvTotycaZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgySIJo2bTtKqmykzwV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz4zfh4kEJyfwJQAcx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGMvFT7Dhf22kjNg94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwvWlx3cblKTjD4z8h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]
```
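A response like the one above can be parsed and indexed by comment ID for lookup. The sketch below assumes only what the sample shows: a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields. The IDs and values in the example string are illustrative placeholders, not real comment IDs.

```python
import json

# Hypothetical raw model output in the same shape as the sample above.
raw = '''[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]'''

# Build an index from comment ID to its coded dimensions.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding by ID.
print(codes["ytc_example2"]["emotion"])  # outrage
```

Indexing by `id` mirrors the dashboard's lookup-by-comment-ID behavior: each coded comment's dimensions are retrievable in constant time.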