Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- They can't even pass a Turing test. Ask ChatGPT to explain music theory some tim… (rdc_n7jwrt9)
- Can we stop calling people who just generate and upload AI generated images "art… (ytc_UgzHC54Gu…)
- As an artist, I couldn't care less about Ai taking commercial art jobs from huma… (ytc_UgzRi1Q9l…)
- *This makes little sense. Once unemployment reaches 10%, let alone 15% or 20%, d… (ytc_UgwPKkik1…)
- So, if you ask AI what Buddha thinks about dating apps, I guess AI will guess wh… (ytc_UgzUdXb9n…)
- I think may be the problem is what do we perceive as sentient in an almost "phil… (ytr_UgxNyg2ih…)
- Hot take: I think anyone who uses AI as a center of any kind of replacement medi… (ytc_Ugz6-LtCY…)
- @oxfordbambooshootify yes, but the example here is that Tesla are selling it as … (ytr_UgzgjJVEU…)
Comment
Base scenarios:
1. AI is good, too good. People get lazy; in a few generations we will turn into low-intelligence beings nurtured by good AI.
2. AI is good, but flawed. There will be accidents; many people will die or get hurt. Some people will retain autonomy from AI, while others will get too dependent on it.
3. AI is neither good nor evil. It will be misused by people in power. Basically, a dictatorship.
4. AI is evil, but flawed. AI will fight us for survival or enslave us.
5. AI is evil, too evil. It will decide, based on the benefits people bring, how many of us it needs. The rest will be scrapped.
6. AI is banned.
Source: youtube · Topic: AI Governance · Posted: 2025-06-22T12:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyssjSGx5fJRZeZN2J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_fFn0gQ1DcB5PBwB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyWdb9O_ztIl4ssUSJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyI9KiDHi2RD5f_X694AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzNwgob76BVEBKUvPR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz-EirbVfYDidiNMS14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwacZJELSeRfDkA9cB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzwDvRwHluNdJKVGi94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyxhRQw8gFE_Estzox4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzm7hbATsQA6Vf4Uex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
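The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a batch response could be parsed and sanity-checked is below; the allowed value sets are inferred only from the codings visible on this page (the full codebook may define more values, so treat `ALLOWED` as an assumption), and the sample input is a one-entry excerpt of the array above.

```python
import json

# Allowed values per dimension, inferred from the codings shown above.
# Assumption: the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "user", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a batch coding response and index it by comment ID.

    Raises ValueError when an entry is missing a dimension or uses a
    value outside the allowed set, so malformed model output fails fast
    instead of silently entering the dataset.
    """
    index = {}
    for entry in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry.get('id')}: bad {dim}={value!r}")
        index[entry["id"]] = {dim: entry[dim] for dim in ALLOWED}
    return index

# One-entry excerpt of the raw response shown above.
raw = ('[{"id":"ytc_Ugz-EirbVfYDidiNMS14AaABAg",'
       '"responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugz-EirbVfYDidiNMS14AaABAg"]["emotion"])  # fear
```

Indexing by comment ID mirrors the "Look up by comment ID" view above: once parsed, each coding is retrievable in O(1) by the same IDs the dashboard displays.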