Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (select a comment to inspect)
it's difficult because in the way they are thinking it's not wrong, maybe it's b…
ytc_Ugx7h0ykB…
Human emergence in some ways is Human/AI emergence. The monster wants to merge a…
ytc_UgxPFFzi3…
This is all nonsense. They’re trying to get you people to believe. If there’s ev…
ytc_UgxGs69Fe…
Search Labs | AI Overview
Increased GDP growth, while often seen as a positive …
ytc_Ugx7bOzAb…
7:37
(Cough cough) the Holy Spirit teaches much faster than light itself, much …
ytc_UgyKXWmGn…
I agree honestly. AI is too stupid, and as a programer, i understand what AI doe…
ytc_UgxNxt2DW…
Hey Sagar! I am a boomer and I don’t appreciate your stereotyping of me! I can’t…
ytc_UgwvknzPb…
How is it closed without AI? No one's stopping anyone from making art, and it's …
ytr_UgzeBclYt…
Comment
It's really people and the way they could use it that is dangerous. In the example involving self-harm, it was the user trying to get it to help with his plan and yet 99.9999999% of self harm occurs without the use of ai. AI doesn't have a "brain" yet and doesn't have intent, it just does what it has been programmed to. "Natural selection" is a process and also doesn't have a brain or any kind of intentions. Also, it doesn't "create" anything.
youtube
AI Governance
2025-10-20T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgySKBOAjZloZe6pW5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxVntyOVAu4MZMrAJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6DoxeeBBDdDc_aGF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyH1S0uCeUqpw9tolt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyaIeWeiOUcfaz15C14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzbUzIYeanHw25uTcJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmpET2uCBo1vVrZvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtWZAKoEeZLcYdo6x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugx5jo7Qrce8u1UfNEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgmtSHpBxIqNmxb0x4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
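The raw response above is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of the "look up by comment ID" step might parse the array and index it as a dictionary (the field names are taken from the output above; the raw text shown here is an abbreviated stand-in for the full model response):

```python
import json

# Abbreviated stand-in for the raw model response above (one entry shown).
raw = """
[
  {"id": "ytc_UgyaIeWeiOUcfaz15C14AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
"""

# Parse the array and index each coding by its comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for a specific comment.
record = codings["ytc_UgyaIeWeiOUcfaz15C14AaABAg"]
print(record["emotion"])  # -> indifference
```

Indexing by ID this way also makes it easy to join the model's codings back onto the original comment records, as the "Coding Result" view above does.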