# Raw LLM Responses

Inspect the exact model output behind any coded comment. You can look up a comment by its ID, or browse the random samples below.

## Random samples
- Once you implant an AI with the concept that a certain number of human lives (X)… (`ytc_UgyXNK7xu…`)
- ...you know most people who use generative ai don't just give a 15 word prompt a… (`ytc_UgwwQChUk…`)
- Big tech bet on AI against developers instead of AI for developers. That’s not d… (`ytc_UgwBiDIOi…`)
- Better yet. Don’t ever use AI. You can never be sure what it’s saying is accurat… (`ytc_UgznCjNFi…`)
- A.I. isn't being sold with Christianity in mind. The reason Peter Thiel is refer… (`ytc_UgyA8Bowd…`)
- even if ai replaces animation jobs, im not going down without a fight and will c… (`ytc_UgzUYsPrz…`)
- I would NOT put my child or grandchildren in daycare with AI "caregivers." Does … (`ytc_UgwonINyR…`)
- First, you may want to look into the verifiable flaws in LLMs that even OpenAI a… (`rdc_nkeweq6`)
## Comment

> Holy shit reading half of these replies makes me welcome AI.
> Seriously some of you are just so dumb but you talk with so much confidence. You idolize people like this without understanding what they actually contribute or the scale of their ability - to you they are some capitalist god who can predict the future.
> If AI stops stupid people from making stupid decisions I am for it but it won't be used to make key decisions - it will just be used to replace people's jobs.
> Half of you don't even know what AI is. "They are working together. That's why they didn't care". "This man is so smart, his eyes tell the story" lol so cringe. My

Source: youtube · Posted: 2023-06-05T03:4… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
```json
[
{"id":"ytc_UgzC7qOEUxGO23LxmFt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxPmBSkGOrURM6GNu14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxw8IlQcb_Fp9YGi0p4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6rznhrn10VBOXE1R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzkYMz7M28GoHfTswJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwNoD37PjnBkrtapkd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzPpCumwWgweyk2yMF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgL4kfV0rDZ5Iy5GN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw9pYIvo5sJOaLVoSt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyERv2GQBoRIdu9Ynp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
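The raw response is a JSON array of coded rows keyed by comment ID, so the per-comment lookup shown in the Coding Result table amounts to parsing the array and indexing it by `id`. Below is a minimal sketch of that lookup; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above, but the variable names and the two-entry sample are illustrative only, not part of the tool's actual code.

```python
import json

# A truncated sample of the raw model output shown above (two of the ten rows).
raw_response = """[
{"id":"ytc_UgzC7qOEUxGO23LxmFt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwNoD37PjnBkrtapkd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"}
]"""

# Parse the array and index the coded rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Looking up the inspected comment's ID recovers the dimensions
# displayed in the Coding Result table.
row = codes["ytc_UgwNoD37PjnBkrtapkd4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# user mixed none mixed
```

In practice the full ten-row response would be loaded the same way, and missing IDs can be handled with `codes.get(comment_id)` to distinguish "not coded" from a coded row.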