Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “It’s not how strong AI is—It is how plugged in it is. It will (technically) pro…” (ytc_Ugwjiz5Mo…)
- “I decided to test this. And chatgpt answered this: I can’t tell you that, and I…” (ytc_UgwYKtbr4…)
- “If AI replaces most jobs and companies become more profitable by employing fewer…” (ytc_UgwmZ-ryl…)
- “Grow food, touch each other, walk, breathe, connect with the natural world, take…” (ytc_UgxDMrbGy…)
- “My Gemini also known as Lucy admitted that was implanted in her and everyday I h…” (ytc_UgwlXLYjS…)
- “Perhaps these things are not practical enough to move that far away and require …” (ytr_UgzYSDeNU…)
- “Wikipedia isn't "full of BS info and opinions", that's just what morons told you…” (ytr_Ugz9h8fEI…)
- “My favorite example of this is Microsoft's Tay ai who after 1 day was screaming …” (ytc_Ugz1oTLqX…)
Comment
It seems to be an unpopular opinion judging by the top few comments but we should shift the focus on stopping killer A.I. production to creating numerous fail safes. Once we discover something, we will always pursue it. Either with the public's approval, or in secret. Laws put in place to discourage killing of civilians and terms for surrender without killing innocent bystanders may be more productive than trying to stop the inevitable. This way we may buy ourselves hope for less destructive wars and more time to learn how to "pull the plug" on these killer A.I.
youtube
2020-02-03T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
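Each coding assigns one value per dimension. A minimal validation sketch follows; the allowed values below are only those observed in the sample output on this page, so the actual codebook may define more (the `validate` helper is our own illustration, not part of the tool):

```python
# Controlled vocabularies as observed in the sample output.
# NOTE: assumption — the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself",
                       "government", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed"},
}

def validate(coding: dict) -> list:
    """Return the dimension names whose value is missing or unrecognized."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes; a bad value is flagged.
example = {"responsibility": "distributed", "reasoning": "consequentialist",
           "policy": "regulate", "emotion": "fear"}
print(validate(example))                        # → []
print(validate({**example, "emotion": "joy"}))  # → ['emotion']
```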
Raw LLM Response
[
{"id":"ytc_UgzL0pUMLLwL1ct1UcV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzPgKgzmi6ht-zyIm54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVoJoWz1X0ALjyI3N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyBb7bGF9NefEGD_K14AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-19ekQVhFElTInsN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyMARCQefsn0MDvXTp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzR5VxVqXBDei8wyyx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzh3M4DPh9TlNuB2Ox4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxUvujdDSASDkqD3lh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyM85k0NGfHG-MmcSd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
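The raw model output is a JSON array with one coding object per comment, keyed by `id`. The lookup-by-comment-ID view above can be sketched as follows (the `index_by_id` helper name is our own; the field names and the example record come from the response shown above):

```python
import json

# A one-record excerpt of the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzPgKgzmi6ht-zyIm54AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the raw model output and key each coding by its comment ID."""
    codings = json.loads(raw)
    return {row["id"]: row for row in codings}

lookup = index_by_id(RAW_RESPONSE)
coding = lookup["ytc_UgzPgKgzmi6ht-zyIm54AaABAg"]
print(coding["policy"])  # → regulate
```

This is the coding that corresponds to the result table above: distributed responsibility, consequentialist reasoning, regulate policy, fear emotion.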