Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Switching off AI if it becomes super-intelligent will be impossible; it is much …
ytc_Ugy_D7CRn…
Glad you found it funny! Sophia has a great sense of humor, doesn’t she? If you'…
ytr_UgxZkuCgS…
I was automating a process at work to help a team with ML and they laughed at ho…
rdc_j0boepu
So far what I see is a sophisticated version of an inflatable doll. Evident…
ytr_UgxN2zSU0…
I love this hypocrisy that AI democratizes art and makes it available for everyo…
ytc_Ugyj8pGga…
I wonder if the A.I have issues with other A.I and have its own cyberspace war. …
ytc_UgxU5VBjN…
There are plenty of times that Claude or ChatGPT spits out code that is way over…
ytc_Ugwf3iW4J…
4:50 Google barely cares about human-related ethics, let alone those concerning …
ytc_Ugz8NuVal…
Comment
AI needs an ethics sub-routine or limits, or it will become a petulant child in existential crisis. With any control over itself or humans can and probably will be devastating. The warnings from the best scientific minds and experts and even Hollywood have been ignored for decades.
youtube
AI Governance
2023-04-18T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzD_LAuu-Che3zWPwF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxW6-XI-bjv_RynZRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx07ASufsKdSMXrXUZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx5PnGQJGA82icDFoB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwcIlD7TPQO7PsGhc94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx82VX3iD-V6pwxvJt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzk4j19XeKbEYP71954AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz8QjJezaG9z1hsWQJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQnJlCS_dtv8bU6OJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugx8VnJfUH66d_4s4t54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
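
The raw response above is a JSON array with one object per coded comment, holding the four coding dimensions from the result table. A minimal validation sketch in Python (the allowed values below are inferred only from the samples shown on this page; the actual codebook may define additional categories):

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any row with an unknown code."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

Rejecting unknown codes at parse time catches the most common LLM coding failure, a value outside the codebook, before it reaches the database.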