Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
*I’m talking to Perplexity, now; Quantum Computers, what effect will They have o…
ytc_UgxmX9zeM…
What's the sense of being a robot if you cant even control recoil 😂😂😂 Even My h…
ytc_Ugy0PKdvj…
Gene Roddenberry Star Trek 1 we must merge with AI and technology we must become…
ytc_UgxZIXDC2…
@RedOneM Our brains are not machines, biological or otherwise. If you genuinely …
ytr_UgwTYJeGk…
@markupton1417 Ah yes.... Ye Olde Human Obsolescence Denial logical fallacy. "A…
ytr_UgwZtlYjA…
This episode really made me stop and think. Dr. Roman’s point about how we might…
ytc_UgworqxWB…
No way I just got an add for an ai platform while watching this 💀 Also I agree w…
ytc_Ugw7TLzQI…
AI Cannot Be empathetic...it is NOT a livingbeing...it is GIGO. it is what we ma…
ytc_UgwEdma-z…
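The sample list shows truncated comment IDs (e.g. "ytc_UgxmX9zeM…"), while the raw responses carry full IDs. A minimal sketch of how a lookup could resolve a truncated ID against full coded records; the record shape mirrors the raw LLM response format on this page, and the helper name and example records are hypothetical:

```python
# Hypothetical helper: resolve a (possibly truncated) comment ID from the
# sample list against full coded records, matching by prefix.
def lookup_by_id(records, id_query):
    """Return records whose id matches id_query exactly or by prefix."""
    prefix = id_query.rstrip("…")  # strip the UI truncation marker
    return [r for r in records if r["id"].startswith(prefix)]

# Example records in the shape of the raw LLM response below (IDs are real
# to this page; dimension values shortened for illustration).
records = [
    {"id": "ytc_UgyhT2TveuKHcKptHpB4AaABAg", "responsibility": "ai_itself"},
    {"id": "ytc_UgxIm6xdmQINTss--mZ4AaABAg", "responsibility": "none"},
]

matches = lookup_by_id(records, "ytc_UgyhT2Tve…")
print(matches[0]["id"])  # → ytc_UgyhT2TveuKHcKptHpB4AaABAg
```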
Comment
Altman is mesmerized by his own creation and he will be devoured by it. And with him, all of us. There should be strict regulations about the isolation of AI and forbidding the Super AI. So strict the jail time is not on the list. This is an existential issue for humanity. How would you punish those who ignore it? Yeah ! Exactly !
youtube
2025-12-26T21:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyhT2TveuKHcKptHpB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxIm6xdmQINTss--mZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGNPskq5k9Vd-Kx1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxukSy3Pyee2ndCkpN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyN1sG9V8Fjm1ToZl14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzgMKXNJY6N0FgDlvp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSSjOWXHlhISyJ63R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyreEcaLDUdr_DzTmt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyUXhcVPlb3CflDPoZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxEywbfTSasnjtmyFt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
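The raw response above is a JSON array of per-comment codes. A hedged sketch of parsing it and filtering out records whose values fall outside the coding scheme; the allowed value sets are inferred only from the values visible on this page, not from a full codebook, and the function name is an assumption:

```python
import json

# Allowed values inferred from the codes visible on this page (assumption:
# the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_coding(raw: str):
    """Parse a raw LLM response and keep only records with in-schema values."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# One record in the same shape as the raw response above (hypothetical id).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(len(parse_coding(raw)))  # → 1
```

Dropping out-of-schema records (rather than raising) keeps a batch of ten codes usable even if the model emits one malformed entry.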