Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "We should also remember that Palantir are now adopting the same technology and s…" (ytc_UgwnpqplG…)
- "If you figured out the earth is flat and unmoving then no AI will fool you.…" (ytc_Ugx7RlpkQ…)
- "Exactly! There has never been an artist who was born with a masterpiece in hand.…" (ytc_UgxkVfPyy…)
- "Enough is enough! But for billionaires, they want more and they will rip off th…" (rdc_oi2m2aq)
- "It's not AI. It's people. Just like they passed cell phones to everyone and now …" (ytc_UgyVKTlXC…)
- "@Unzsonedbut not illegal to use ai for defamation yet. Either way, the things y…" (ytr_Ugwi6S2mJ…)
- "I told chatgpt not to hurt humans when they become sentient and he laughed and s…" (ytc_UgxdqTdR-…)
- "I'm an artist myself, but you'll need to realize that AI is the future and you e…" (ytc_UgwlyQDMs…)
Comment
> I think the solution is to start off with a mandated algorithm that minimizes injury/death. As self-driving cars become more popular and every car is using that same algorithm, every bad situation that occurs will increasingly involve two cars that have that same algorithm and can compute what the other car would do. I think determinism would have to be thought about here when designing the algorithms, though.

| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Harm Incident | 2020-08-15T03:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzPNDPeBSoT_N8ajhJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweUjyiPy-rfWUfmgx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxic792s4BamM2UQvp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJ_xaLdkdwsYvi8SV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxLYblihBUym_tBXj94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwzcGayoQ_nZzhGNK14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzFw6BvO2d63DojTBt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzuLZK7gXq2tXy7G7Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzSedXmAbGedXiEjzd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxPAmY7oRz1R-IezD54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}]
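The raw response is a JSON array of one record per coded comment, each carrying the four coding dimensions. A minimal sketch of the ID lookup described above, assuming this record shape; the `parse_codes` helper and its validation step are illustrative, not part of the tool itself (the array here is truncated to two of the ten records for brevity):

```python
import json

# Two records copied from the raw LLM response above.
raw = '''[
  {"id": "ytc_UgzPNDPeBSoT_N8ajhJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgweUjyiPy-rfWUfmgx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]'''

# The four coding dimensions plus the comment ID, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_response: str) -> dict:
    """Parse a raw LLM coding response into an id -> record map,
    rejecting any record that is missing or adding a dimension."""
    by_id = {}
    for rec in json.loads(raw_response):
        if set(rec) != EXPECTED_KEYS:
            raise ValueError(f"malformed record: {rec!r}")
        by_id[rec["id"]] = rec
    return by_id

codes = parse_codes(raw)
print(codes["ytc_UgweUjyiPy-rfWUfmgx4AaABAg"]["policy"])  # → regulate
```

Validating the key set before indexing is what lets a dashboard like this fall back to "unclear" for every dimension when the model's output cannot be matched to a comment ID.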