Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- @DaVioletShark Human brain just automatically corrects the word if the first and… (`ytr_UgxKkiCdp…`)
- We already have this for years. Why we got Facebook,Twitter,Instagram,Google and… (`ytc_UgxaoC5DM…`)
- If about half of these sites are AI, then you’ll be forming your opinions based … (`rdc_m5zily1`)
- The robot in the Middle with the Hat on looks like it's come from a laboratory i… (`ytc_Ugzp7pQ6G…`)
- Yesterday a robot that was being developed by a Japanese company killed 29 peopl… (`ytc_UgyoC6kxt…`)
- This was in the works for decades. And just recently the jobs report just announ… (`ytc_UgxGpefLf…`)
- 55:06 We release models openly, becaause they are mostly harmless. They can't d… (`ytc_UgwudaEM1…`)
- LLMs should be chopped up into doing speficic tasks in specific narrow domains o… (`ytc_UgyN8lUbm…`)
Comment

> Imagine we have an Ai unit fully integrated into our internet system backed by some huge cloud provider with access to as many historical and political data. Some intelligent human being decided that Ai should have rights, now imagine some intelligent person goes against it's rights.....

Source: youtube · AI Moral Status · 2023-01-13T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugx6gKpWgPKh0fZApld4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4bfx3Et6vYkUGw6N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSOIe6chhvQsGOXs14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyg1UaHKNstX1p4Kgt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz-3S7rE47A4rOZQqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
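The raw response above is a JSON array with one record per comment: an `id` plus the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and validated — the allowed value sets below are inferred from the examples visible on this page, not an authoritative codebook:

```python
import json

# Allowed values per dimension, inferred from the responses shown on this
# page; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "approval", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        # Skip records missing an ID or using a value outside the codebook.
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_Ugyg1UaHKNstX1p4Kgt4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"fear"}]')
print(parse_codings(raw)["ytc_Ugyg1UaHKNstX1p4Kgt4AaABAg"]["emotion"])  # fear
```

Validating against a closed vocabulary like this is one way to catch the occasional malformed or hallucinated label before it reaches the coding table.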