Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Ok I have an argument why it's likely that AI might not want to kill us all: ali…
ytc_UgxgrK6C2…
Nah nah never that won't happen bro 😔 stop us doing to stop sun radiation and st…
ytc_Ugz63BzYt…
Today humanity has only ever painted pictures of artificial intelligence taking …
ytc_Ugxk9U99e…
The only common reason for ai robots or killer robots is for the massive populat…
ytc_UgzN2Tnsw…
The more we make Ai like us, the more we will be able to see ourselves in them. …
ytc_UgzovkLcL…
Wow it must of been pretty icey 🧊 and slippery on that nice warm summer night be…
ytc_Ugyt68F1q…
In the topic of the AI mind. First a premise, the mind and the brain are two se…
ytc_UgzvMp8dI…
The brutal truth is In five years, an army of AI agents could run the entire com…
ytc_UgxFWHu8B…
Comment
Before blaming technology, pls remember humans have always exploited ecery mew inversion for their own evil. Pls understand that AI and automation based human crime will be executed. Pls consider this.
youtube
AI Responsibility
2025-03-10T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyKVnfhLKu-VZLAnH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwF9QWarB7sxn_NIQ94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxd97_qYvAS9j0Zb194AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4l1rmO445akX65Ch4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgybhSkYG9y6nqM1ZVR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyWha85AzikOhr9hSV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxwsFcG1MRFY87iss54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwTn6hk8yf9laFw0Bd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8G2InHgbXsXf949x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5QWzAG_RAN1ZPPUd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
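The raw response is a JSON array, one object per coded comment. A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in the dump above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`):

```python
import json

# Two rows copied from the raw LLM response above; the full payload
# would contain one object per comment in the batch.
raw = '''[
  {"id":"ytc_UgwTn6hk8yf9laFw0Bd4AaABAg","responsibility":"distributed",
   "reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw8G2InHgbXsXf949x4AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Index the parsed rows by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_UgwTn6hk8yf9laFw0Bd4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed resignation
```

The looked-up row matches the "Coding Result" table shown for the selected comment (responsibility: distributed, reasoning: contractualist, policy: none, emotion: resignation).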