Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This is why I'm poisoning my art by the way. That way so that if AI ever tries t…" (ytc_UgwyBQ3-D…)
- "Seeing all the misdiagnoses all around, I am willing to take my chances with an …" (ytc_Ugy7vgR-p…)
- "Let's regulate AI to the point it barely feels like we have it! In the meantime,…" (ytc_Ugx0jK-tw…)
- "FIRE AT WILL! / Robot: i dont wanna fire at Will. Will is a good ol' boy.…" (ytc_UgzrxkwhP…)
- "I read the METR source around the 1 minute mark that claims to show the maximum …" (ytc_Ugwy955sT…)
- "@atlasfeynman1039 So the only thing the superintelligence AI need is to break th…" (ytr_UgwXNLVUB…)
- "Nomi is extremely diverse in its selection for base avatars - and even then, you…" (ytr_UgxYISLkd…)
- "I was today years old when i found out ChatGPT is actually a stoner working a mi…" (ytc_UgybFvQjx…)
Comment
Everyone seems to forget AI's biggest flaw. Once the power goes out, it's done. That's the simplified version. There has to be many things that have to happen to get AI to take over. They need to become completely autonomous by having a body to create fuel to keep the power going without us....and that means a lot of robots would have to be made with specific intents. So unless there's a secret automated facility out there facilitating this then the best the AI could do is have the ability to start WW3 and end its own existence in the process
youtube · AI Governance · 2023-07-08T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
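Records like the one above can be sanity-checked against the label sets that actually appear in this dashboard's output. A minimal sketch, assuming the allowed values are exactly those observed in the coded samples on this page (the real codebook may include labels not seen here):

```python
# Label sets inferred from coded output observed in this dashboard;
# the authoritative codebook may differ.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimensions whose value falls outside the observed label set."""
    return [dim for dim in ALLOWED if record.get(dim) not in ALLOWED[dim]]

# The coding result shown above passes the check:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(invalid_fields(record))  # → []
```

A record with an out-of-vocabulary label (e.g. a hallucinated category) would surface the offending dimension name instead of an empty list.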
Raw LLM Response
```json
[
  {"id":"ytc_Ugw4ln9Yw3FYWIOWMHV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyvZPzsWd73zjmgGW14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoGxGmjDa_9fRaNUl4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwmcDLFqzIEvBVrpxl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyfFsV_QFTYUmylSel4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugxpgsd9jX02JrMTj7B4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugxc_hTFU4UecOS-XKN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzyxIuxiaxcy4-0X5Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz3350P8893k-gK3aN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyMkUEEUx0KQog2SHB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
```
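A raw batch response in this shape can be parsed and indexed by comment ID for lookup, which is what "Look up by comment ID" amounts to. A minimal sketch (field names taken from the JSON above; the single-record sample string is abbreviated for illustration):

```python
import json

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    and index the records by their comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# One record from the batch above, as the model emitted it:
raw = '''[
  {"id": "ytc_Ugw4ln9Yw3FYWIOWMHV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]'''

coded = index_by_id(raw)
print(coded["ytc_Ugw4ln9Yw3FYWIOWMHV4AaABAg"]["policy"])  # → regulate
```

In practice the model output would first need the usual defensive handling (stripping markdown fences, catching `json.JSONDecodeError`) before indexing, since raw LLM responses are not guaranteed to be valid JSON.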