Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I'm a biologist, did my graduate work in computational neuroscience, and this is… (rdc_mzy56vk)
- So called ai artists put in effort because ai is such a shitty tool, that making… (ytc_UgxKi7_Vn…)
- @jamamas370 make art for the sake of making art isn't good? Just because you mak… (ytr_Ugx570hCS…)
- Artificial intelligence has been developed and trained by human evolutionary cre… (ytc_UgwqxXOlC…)
- I don't want AI to replace art and drawing. Instead it should replace the househ… (ytc_Ugw-5WL9E…)
- I just tried it and damn, yeah, it's pretty impressive. It sounds pretty natura… (rdc_mfhyrxw)
- Democracy was threatened by a pathological liar and moron named Donald Trump and… (ytc_UgzJxzyOV…)
- I wonder if the reporters own the rights to their personal images. I imagine the… (ytc_UgzPojNyy…)
Comment
Assume Very clever people will apply AI to most pressing problems:
- The environment and climate change
- improving productivity, economic efficiency and remove mundane work with automation
- reduce crime and poverty.
- get rid of human suffering such as war, disease, starvation and mental health problems.
Wouldn’t an intelligent agent to solve all the above problems, on the basis of pure logic, recommend to get rid of (most) people?
youtube · AI Governance · 2025-06-25T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxieAyXjJpKA1Lvk-B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzSLHwoPLGzsBwBM254AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz7YRLsoRkibqPlxKN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyBpFJmWdO9-A2-poJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyDhR7t9dqJu1Mtuc94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgztSgqqDkDX4QMI4td4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxgDGyAc-KhT0sONct4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz6I7ZG3kfd772bK7l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgysNaICClKUlbKPkxR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
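The raw response is a JSON array with one object per coded comment, each carrying the four dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response might be parsed and indexed by comment ID follows; `index_codings` is a hypothetical helper for illustration, not part of the tool, and the two sample objects are taken from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Two entries copied from the example response above.
raw_response = """[
  {"id": "ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyBpFJmWdO9-A2-poJ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "liability", "emotion": "fear"}
]"""

# The four coding dimensions every entry is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the raw LLM response and key each coding by its comment ID,
    so a single comment's coding can be looked up as the inspector does.
    Raises ValueError if an entry is missing a dimension."""
    codings = json.loads(raw)
    for entry in codings:
        missing = [d for d in DIMENSIONS if d not in entry]
        if missing:
            raise ValueError(f"{entry.get('id')}: missing {missing}")
    return {entry["id"]: entry for entry in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg"]
print(coding["reasoning"])  # consequentialist
```

Keying by ID also makes it easy to detect comments the model skipped or duplicated, by comparing the index's keys against the batch of IDs that was sent.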