Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
if someone can afford to even have access to ai they have no business talking ib…
ytc_UgwMmIl4L…
The danger in AI is that most don't need realize the entities behind it.
The bib…
ytc_UgzDm9TDk…
AI can be used for good things but like atomic energy it can be used as an evil …
ytc_UgztawWHc…
(I just want to say if you think that we are going to die out because AI. Just r…
ytc_UgwACHNuX…
Robot: "why this box of vegetables looks weird? Well it is my job so I will do t…
ytc_Ugxv35yii…
If you do nothing but write prompts for AI to build art for you, you're no more …
ytc_Ugz6LUI3w…
Ai make alot mistake and it is very tough to find a mistake in full code of webs…
ytr_Ugwx3rvOl…
Or make an ai write it, and then change random parts and details across the thin…
ytr_UgxuN5n1e…
Comment
Do you remember everybody telling us the Y2K bug was going to end the world?? Nuclear was predicted to kill us all within 10 years... Same with the advent of the automobile etc... This hyperbole is nonsense. We are completely in control... Everything technical has an off switch switch. Human misuse is what we should be concerned about.... Hinton starts his conversation today by saying it's the bigger risk, not the risk of AI becoming autonomous and killing us all... The constant cautions from the intelligente is really more about them covering their butts.
youtube
AI Governance
2025-06-16T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzDglyRLgwTkNZM31F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFhOVL9WWzSdSFUU54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-RYr8THpSHTVXkcF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwEXpw0pxFXUcim-Wh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzIUfltrwngrAZ2e5Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIz75I2z3J1fSL_nt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzSaVWBDiDMoNu943V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFmKq9KD2WCrq3dip4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRCwvxIl9w3q5jkm94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzurn7YRxk9-ohrl4B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
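The raw response is a JSON array with one record per coded comment, each carrying the four coding dimensions from the table above. A minimal validation sketch in Python follows; note that the allowed value sets are inferred only from the samples shown on this page, not from the project's actual codebook, so they are assumptions:

```python
import json

# Allowed values per dimension, inferred from the records shown above.
# The real codebook may define additional categories (an assumption).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "approval", "mixed", "outrage",
                "resignation", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against ALLOWED.

    Raises ValueError on a malformed record so that bad codings fail
    loudly instead of silently entering the dataset.
    """
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples start with ytc_ (comments) or ytr_ (replies).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with one record copied from the raw response above.
raw = ('[{"id":"ytc_UgzDglyRLgwTkNZM31F4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codings = validate_codings(raw)
```

A check like this is useful between the LLM call and the database write, since a model can occasionally emit a label outside the codebook.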