Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "About the voice part. What if I get a cold flu: Will the voice biometrics recogn…" (ytc_UgyhFBNOB…)
- "Yes, it does make it automatically bad. Now if it's used a little bit in the cod…" (ytr_Ugxe5JivU…)
- "Seriouly, anyone who thinks ai will soon replace humans never used ai more then …" (ytc_UgwCvwMCo…)
- "Dr Amanda Calhoun, an expert on the mental health effects of racism in the medic…" (rdc_jidj9yo)
- "Hm, first that comes in mind is to try taking photos in a museum or try taking a…" (ytr_Ugx047Ggg…)
- "Claude already generated it's own language to converse with other claude bots. S…" (ytc_UgygWj-8_…)
- "Its not fully because ai is getting better, its also because those real people l…" (ytc_UgygdJjiq…)
- "People who use ai and claims themselves artists? Nah,just make them grab a penci…" (ytc_UgxxFH8Fm…)
Comment
One thing I really feel like I need to mention is that this is about ~generative~ AI.
Artificial Intelligence as a whole is something that could solve a lot of actual problems and make our lives easier, but the only ones who can truly be on the forefront of this technology are already large tech companies that, instead of going through all the work of solving a problem, will use advertising to carve a niche for themselves that only their product can fill.
Honestly reminds me of "Rät" by Penelope Scott.
It's not AI or space travel that's the problem, it's how companies will warp these sci-fi dreams into whatever they think will turn a profit instead of any actual benefit to humanity.
Source: youtube · Viral AI Reaction · 2025-03-31T03:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxJrOtHaFkTZyn4QnB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz_vsV9H5Iaa5Oi5zR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOWBgTtEf3Vw7ch-94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmmTTyPUJNz2n0Tnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrBw5aqDQn7WZxC5t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxw6tkOgZvzHRRJ-VF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBfvpcR6OZECMulvZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxAIK6bnpUfCIyUMlJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxWXe3U4LUz3vHrRLt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwlg5p95JJ-A8y51-t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
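A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical example (not the tool's actual code): it validates each entry against the dimension values seen in this page (the enumerations are assumed and may be incomplete) and drops entries with unrecognized values.

```python
import json

# Allowed values per coding dimension, as attested in the samples above.
# Assumption: these enumerations may be incomplete.
SCHEMA = {
    "responsibility": {"developer", "company", "none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"outrage", "indifference", "approval", "fear"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    skipping entries whose values fall outside the schema."""
    indexed = {}
    for entry in json.loads(raw_response):
        values = {k: v for k, v in entry.items() if k != "id"}
        if all(v in SCHEMA.get(k, set()) for k, v in values.items()):
            indexed[entry["id"]] = values
    return indexed

# Lookup by comment ID, using one entry from the response above.
raw = ('[{"id":"ytc_UgyOWBgTtEf3Vw7ch-94AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')
codings = index_codings(raw)
print(codings["ytc_UgyOWBgTtEf3Vw7ch-94AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each inspected comment is a dictionary access rather than a scan over the full response.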