Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Every country on Earth could agree with a set of laws regarding LAWS. That solves nothing in reality. Throughout all of human history there has never been a society with laws that had no law breakers. It is better to explore LAWS so that we can have a basis for exploring the far more important anti-LAWS technology that will keep us all at least moderately safe. This is the hoary old anti-gun legislation argument dressed in different clothing. What's worse is that it takes more effort to create a gun than a killer robot with today's technology. It is better that we face reality. Then we should develop and deploy anti-killer robot technology. No mere laws will stop the development of killer robots. People Break Laws. It is in our nature.
{o.o} · youtube · 2020-01-21T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx9cb65B9oojBKLTOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzlaZE3OguKqrIZm8R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwpeNlfoo5SxLW2V2V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGpNdDMurRR-6OLH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwZIQCIMpNeR3KAW0R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwpVTOpudVsGr_1srh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzELMs50ID3XIbViwR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxuuZhyWITOckHJI5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8FH3md_J2QNUt79Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyMd2NqcBFjhZjqwHp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
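The raw response above is a JSON array with one record per coded comment, and the Coding Result table is just the record whose `id` matches the displayed comment. A minimal sketch of that lookup in Python (the `lookup` helper and `DIMENSIONS` tuple are illustrative names, not part of the tool; the sample record is taken verbatim from the response above):

```python
import json

# One record copied from the raw LLM response shown above;
# a real response is an array of many such records.
raw_response = '''
[
  {"id": "ytc_Ugx8FH3md_J2QNUt79Z4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "industry_self", "emotion": "resignation"}
]
'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw, comment_id):
    """Parse the raw model output and return the coded dimensions
    for one comment ID, or None if the ID is absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            # Keep only the coding dimensions, dropping the id itself.
            return {dim: record.get(dim) for dim in DIMENSIONS}
    return None

result = lookup(raw_response, "ytc_Ugx8FH3md_J2QNUt79Z4AaABAg")
print(result)
# → {'responsibility': 'distributed', 'reasoning': 'mixed',
#    'policy': 'industry_self', 'emotion': 'resignation'}
```

Because the model emits the array as plain JSON, a single `json.loads` plus an ID match is all the viewer needs to render the table for any coded comment.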