Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- There can't be "AI art" in my opinion. Of a human isn't behind it, what's the fu… (ytc_Ugx5JvdR8…)
- A.I. freaks me out. I guess it's my lack of comprehension on the subject. I feel… (ytc_UgyMP21sY…)
- But in your past thumbnails, I thought AI was replacing engineers 😅. Thanks for cla… (ytc_Ugw2yhgZV…)
- The A.I. scare is history repeating itself. There have been MANY scares in our c… (ytc_UgxiVOBFV…)
- Middle class yes, working class no. It’ll just be working class or elites. AI … (ytc_Ugy1YUNKI…)
- If some one saw my ai chat I'm bloody dead man the fbi are after me… (ytc_UgyvuvZWz…)
- Just as an fyi to artists who are concerned about style mimicry, there are tools… (ytc_Ugy6vC222…)
- I was in a Waymo on a 2 lane street and it slowed down for some reason, at first… (ytr_Ugxznmiq5…)
Comment
I feel this ultimately is about finding meaning in your life.
Why would AI choose to kill humans. We are not at the same intellectual level with AI, how can we comprehend that it's only solution is death and destruction, that's a human result. It sounds to me the real threat is still other humans.
If I have all this time and AI makes life easy, what's in it for AI. If it can think for itself, why would it choose to do anything for us? It could decide to leave Earth entirely.
Also, does our most advanced AI know it's in a simulation? It sounds like we are trying to create a supernatural being.
This argument is endless to me, and it all boils down to one question, why?
Source: youtube | Topic: AI Governance | Posted: 2025-09-05T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzxRCNf7iGX-Q6ihgp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxB7V-AAEXABYtCZp54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzkLdbSwH3TxmteiJh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwOZBw31wfLBqNrWJZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyJTHUu1jPzlRxYJoV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy0yCcBvEQ528UcMcp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx23fgMhxzjkjJS7dp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeHe5CH8EQHjdSlwZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx9NVRD5Mb7H5NIz6J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz-7jOWrYHphDHW0OV4AaABAg","responsibility":"ytc_ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
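The lookup-by-comment-ID view above can be reproduced directly from a raw response like this one. The sketch below is a minimal illustration, assuming only what the JSON shows: an array of objects, each carrying an `id` plus the four dimension keys from the Coding Result table. The `index_codings` helper is hypothetical, not part of the actual pipeline.

```python
import json

# A raw LLM response in the shape shown above: a JSON array of codings,
# one object per comment, keyed by comment ID.
raw_response = """[
  {"id": "ytc_UgxB7V-AAEXABYtCZp54AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw response and index the dimension values by comment ID."""
    codings = json.loads(raw)
    return {c["id"]: {d: c[d] for d in DIMENSIONS} for c in codings}

by_id = index_codings(raw_response)
print(by_id["ytc_UgxB7V-AAEXABYtCZp54AaABAg"]["emotion"])  # → resignation
```

With an index like this, "Look up by comment ID" is a single dictionary access, and a missing dimension key raises a `KeyError` at parse time rather than surfacing later as a blank table cell.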