Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI mimicks human behavior, so no doubt ai will replicate evil behavior given eno… (ytc_UgxCxpPrS…)
If the AI is Advance Enough, It will think all creation including humans is a re… (ytc_UgzNXORbD…)
I thought it was very clear that there was a difference between drawing somethin… (ytc_Ugy8SEzAc…)
The idea of using AI in interrogations doesn't make me happy at all. Remember po… (ytc_Ugw9JL3Bo…)
It is not US vs China. See the light. Ask what AI can do for you.… (ytc_Ugz8sSUz3…)
So the idea of cutting labor to save money falls apart once all other companies … (ytc_Ugxo5hmYR…)
Yeah that cop wasn’t expecting to see that waymo car driving by itself with no o… (ytc_Ugxh4juSU…)
We forget one thing in this whole discussion about AI and this is the following:… (ytc_UgybwcRhr…)
Comment
“The more compute you add, the smarter it gets” — oversimplifies a messy reality. Sure, scaling compute and data can improve performance, but it’s not linear, guaranteed, or equivalent to “intelligence.” There are diminishing returns, scaling laws, and architectural bottlenecks.
“Alien intelligence” — that’s just drama. Models are statistical pattern machines shaped by human-made data and architectures, not mysterious entities with alien goals. They don’t “wake up” just because you plug in more GPUs.
What he’s doing is packaging uncertainty and complexity into a digestible doomsday pitch. It’s an easy way to grab attention (“only 5 jobs left by 2030!”), but it rides on exaggerating AI as magical rather than mechanical.
youtube · AI Governance · 2025-09-04T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxrBrXu90G8WfDkW6F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMFip-m-SoPj9IJvV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2N4j3Fm1fMaQG3Xp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw_HgtOjilI4uzrC8R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyd9LnV5jNUb0nUhxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyIpY-WIkb63NFZl014AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzd_Xy3wIPYplKSbOl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxhxt0GTSCV8_dYfjp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzn1HTUGw_VQuQ3msB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyifh9v-gQEUP6V2wd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
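The raw response above is a JSON array of per-comment codes, one object per comment with four dimensions (responsibility, reasoning, policy, emotion), so looking up a coded comment by its ID comes down to parsing and indexing. A minimal Python sketch, using a subset of the array shown above; the `index_by_id` helper is a hypothetical illustration, not part of the tool:

```python
import json

# Raw LLM response as shown above (subset): each record codes one comment
# on four dimensions: responsibility, reasoning, policy, emotion.
raw = '''
[
  {"id":"ytc_UgxrBrXu90G8WfDkW6F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw2N4j3Fm1fMaQG3Xp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
'''

def index_by_id(payload: str) -> dict:
    """Parse the model output and index the coded records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(payload)}

codes = index_by_id(raw)
print(codes["ytc_Ugw2N4j3Fm1fMaQG3Xp4AaABAg"]["emotion"])  # fear
```

In practice the same lookup drives the "inspect by comment ID" view: parse once, index by `id`, then fetch any dimension of the coded record in constant time.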