Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The problem with AI Art is that while it *can* be difficult to master, it's proc…" (ytr_Ugz_UU84s…)
- "The danger of AI is misalignment, that's all; it's ridiculous to cry…" (ytc_UgyHXIj_h…)
- "UBI is the answer. Most of the jobs will be automated or overtaken by AI, but …" (ytr_Ugw6yq8Bf…)
- "Let's see how much money they make when no one has an income to buy all these pr…" (ytc_UgxGLF9_r…)
- "@macmacgkc1st well the ai cant feel sad or angry, litsen if ai ever is sentient …" (ytr_UgxkYGpMg…)
- "Well did they check the test group’s ability BEFORE the LLM test? When you have …" (ytc_UgxFPJ-D4…)
- "I completely agree on this one. I also have concerns that with AI being used for…" (ytc_Ugyt2EczX…)
- "Human as with all else, the atom split and nuclear energy unleashed can play a n…" (ytr_UgxbUQgEs…)
Comment

> Not seeing the negative side at all is the issue. The birth of SkyNet has begun. What do humans do when they finally realize we've lost control? Nothing we really can do. Any possible thought to prevent something would have already been thought of by A.I. And if it really came down to war like the movies, it wouldn't even be a fair war. Have you ever went head to head with a hacker in COD with their hacks turned all the way up? Prime example on how machines can obliterate even the best of the best on the same team. And A.I can do it single handedly. Now imagine millions of them. Let it sink.

youtube · AI Responsibility · 2023-05-20T03:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyS0o9_IA3K_a2lw4x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsuNTCJ5Cx_lnRkQJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw4TapWQti14gCMhWt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzUYp4hYb5GlcMKN6B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzj-6SVXwSx8DIac4J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxGYlG3PDQmObi9qyl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyqKYANBjDPNvVqQ014AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxwJEA5teBlQc31g-B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgyRQyfB9NVHQI9YaqR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYcd7TL4dXvYjBfiR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
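Before a raw batch like the one above is written back as a coding result, it is worth parsing and sanity-checking each row. The sketch below is a minimal validator; the allowed value sets are assumptions inferred from the codes visible in the samples here, not the full codebook, so extend them to match the actual coding scheme.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred
# from the values observed in the sample responses above; the real
# codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "mixed", "outrage", "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes
    are all drawn from the allowed sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows that fail validation can then be re-queued for recoding rather than silently stored with out-of-schema labels.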