Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It will with time. AI destined to rule us all at some point. I am already accept…" (ytc_UgxOwypvu…)
- "Ai is so cruel how could they remove the limbs from an innocent cat and rip and …" (ytc_UgxutP9NT…)
- "Does AI know we’re talking about it and predicting it may be the end of humankin…" (ytc_UgxXtdURy…)
- "Thank you, Ethan! You took a bleak situation that was getting me down as an arti…" (ytc_Ugz2192N0…)
- "@alex-rs6ts Yes because they actually have to walk, plan and learn how the camer…" (ytr_Ugw42QxhP…)
- "what could go wrong?? 😂... humanity caused an accident or a deliberate malicious…" (ytc_UgyF99Nur…)
- "So basically the Planet will become unliveable because of AI. And then the Rich …" (ytc_UgytauU6M…)
- "I do think that there is a real possibility that AGI and super-intelligence can …" (ytc_UgzaJfH6T…)
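Clicking a sample resolves its truncated ID back to the full coded row. A minimal sketch of that lookup in Python, assuming the coded rows are held as a list of dicts shaped like the raw response below; the prefix-matching helper and its name are hypothetical, not the app's actual implementation:

```python
def find_by_id(rows: list[dict], query: str) -> dict | None:
    # Hypothetical helper: displayed IDs are truncated (e.g. "ytc_UgxOwypvu…"),
    # so match on the visible prefix and only accept an unambiguous hit.
    prefix = query.rstrip("…")
    matches = [row for row in rows if row["id"].startswith(prefix)]
    return matches[0] if len(matches) == 1 else None
```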
Comment
The First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
The Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
The Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube · AI Harm Incident · 2024-01-05T22:0…
Coding Result
| Field | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
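The four coding dimensions are categorical. A minimal sketch of validating a coded row against the label sets, where the allowed values are inferred only from the rows visible on this page (the project's full codebook may define more):

```python
# Label sets inferred from the rows shown on this page; an assumption, not the codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def validate_row(row: dict) -> list[str]:
    # Return a list of problems rather than raising, so a whole batch can be checked.
    problems = []
    for dim, allowed in ALLOWED.items():
        if row.get(dim) not in allowed:
            problems.append(f"{row.get('id', '?')}: bad {dim} value {row.get(dim)!r}")
    return problems
```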
Raw LLM Response
[{"id":"ytc_Ugy07hyGB0pJod__Md94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyzvq7GtdSMeWXd1xJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5alMLIOPe5sAN3V94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwn5LwEsORtsedjd9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzEmYc7AJ86YGzAgKN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyEQ-E6fJNRN5s54kp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwHj6hv-qGE5sE3d614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0eGi6eyfGud8T5Bl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz34DIBmjm_14lY8ht4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx6mH1UC7UzStQQMxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]