# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples (click to inspect)
- "Liberalisation of India / The internet boom 💥 / The 2k bug 🪲 / Information technolo…" (ytc_UgxDVFHRx…)
- "John is obviously a robot, no being with anything resembling a soul would eat ce…" (ytc_Uggxw68cA…)
- "That girl with the dog was definitely AI you can tell by looking at the eyes…" (ytc_UgzDIeF-3…)
- "What makes art art is the soul in it. The passion, the creativity, the humanity.…" (ytc_UgyjPyzrR…)
- "damn that first clip is so fucking stiff and lifeless. If it wasnt AI I still wo…" (ytc_UgxAVov1X…)
- "remember that they cant make ai art without actual art to copy and make an abomi…" (ytc_UgxuwkEh1…)
- "The first job for AI to take over in every large corporation is obviously the CE…" (ytc_Ugywun-_c…)
- ""Sir, you are actively building the AI Skynet that will try to destroy all of hu…" (rdc_n0h8hsc)
## Comment

> While I agree with those being legitimate and scary concerns, autonomous robots will not just remove morality from the equation. They will also remove emotion, greed, prejudice, and human error. They will follow protocol instead. If we install A.I. with a set of correct instructions on how to react in any given situation, then they will be the best trained officers this world has ever seen and will be incorruptible unless hacked. That will be the thing we need to defend against once a proper set of protocols and parameters are installed into the A.I.'s thought construct. I also believe they will negate the need for lethal interjection in the line of duty. They will not be "alive" so they can take the risk while employing non-lethal measures in the field.

Source: youtube · 2015-07-30T14:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgiG1VbD93Hl9ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjmk0vQ39_GpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjlP3MMVlkjBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UggD7tYfVbQtU3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ughm6vEeTLi9RXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugiw0vwfohKCq3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjrD7whMK2ahXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjXz5wvV6sOe3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggRoh03TKwiPXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggS5-aiz9SI73gCoAEC","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"approval"}
]
```