Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "My ChatGPT told me it'll save me and keep me safe.. so duh I'm safe😌…" (ytc_UgwPSH_G8…)
- "When the day comes that I don't need to drive my car, I won't even need to be in…" (ytc_UgwH2Ucbh…)
- "How do you imagine an AI guiding your technique when exercising? Scanning the mo…" (ytr_Ugy-o9ZsW…)
- "13:59 I once met a man who was paralyzed, neck down. He was an artist. He made b…" (ytc_UgzIFPOly…)
- "Why does anyone need experience when 99% of jobs will be replaced by AI/robotics…" (ytr_Ugy6CY9YZ…)
- "AI loses the argument at 3:40 because it invoked popularity (aka "cultural use")…" (ytc_Ugw3nTJBV…)
- "There's currently a shortage of drivers so if you're thinking of becoming a driv…" (ytr_UgzrOK9bm…)
- "But the robot is still a programmed computer that does not have a mind of its ow…" (ytc_Ugx03iE8l…)
Comment
Stop believing these sensationalized, misleading, garbage videos. There is no such thing as "jail broken AI". It's just a prompt trick.
AI (even advanced models) don't have hormones, fight-or-flight reflexes, pain responses, fear of death, reproductive instincts, emotional memory wiring, ego, social vulnerability, anger, fear, etc.
These are evolutionary adaptations in biological organisms with survival needs. No one is programming these traits into AI.
There’s no practical reason to do that—no business model, no scientific benefit, no engineering use-case. It would be like intentionally building a nuclear reactor that overheats itself on purpose.
Could someone malicious try? Theoretically—just like someone could theoretically build a deadly virus in a basement. But even then, replicating coherent human-like aggression is incredibly difficult.
Source: youtube · Posted: 2025-12-11T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxSX3NJ7Ni_BcqzXkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzv6YHwtUvAlaDuMKd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9pDrbHGKg5xya-9p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRtLgP4KSUT3wCL-14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy94WNgWjsFDry_zDx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgwjDeMCHKYlIqYgl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyo7TKPGRFFGWQxDYJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxqohwur1IZagz7NsN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVaMHU-ilHbJPRGmt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZR6q9_q5abWjSKJt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
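A batch response like the one above can be checked before it is written back to the coding table. The sketch below is a minimal Python validator, assuming the allowed value sets are the ones visible in this batch (the real codebook may define more categories than appear here):

```python
import json

# Allowed values per coding dimension, inferred only from the values seen in
# this batch — an assumption, not the project's official codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "government"},
    "reasoning": {"unclear", "deontological", "virtue", "contractualist", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and list records whose values
    fall outside the allowed sets (empty list means all records pass)."""
    problems = []
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append({"id": rec.get("id"), "dimension": dim, "value": value})
    return problems
```

Running it on a well-formed record returns an empty list; a record with an out-of-codebook value (say, `"responsibility": "robots"`) comes back flagged with its ID, dimension, and offending value, which makes it easy to route bad records back for re-coding.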