Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
55:06 We release models openly, because they are mostly harmless.
They can't design pandemics. They can't design at all. We don't have automated factories for a hypothetical Skynet to build terminators. No automated biolabs where a hypothetical Wintermute could unleash a superbug.
Current models are just a better CAD assist in some scenarios.
The current trend is that making a stronger model requires exponentially more resources. It's the opposite of the singularity scenario of a superintelligence becoming exponentially more intelligent on the same hardware.
It'll take several hardware and software revolutions to get to superintelligence, if it is possible at all.
Source: youtube · Video: AI Moral Status · 2025-10-31T12:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzc3FoPlmUo13BjPY14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxkjE5TvWv7DeFuViF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5PtLrX3BuN2PtF-54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyfUu7tZNYOzfxMjRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHLr-umR1_GpE6nKJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgymNPOjttGRoP6gWWl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0FpkSc1Ljjwgy7Ux4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwudaEM1sWDSMh8F8p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyE7nbis9oK0bLu-Wh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwt0ssXHCnyjjWW5Ql4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
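The raw response is a JSON array of per-comment coding records, one object per comment ID. A minimal sketch of how such a batch could be indexed for per-comment inspection (the variable names are hypothetical; the two records are copied from the array above, truncated for brevity):

```python
import json

# Raw LLM response text: a JSON array of coding records,
# one per comment, keyed by the "id" field.
raw_response = '''
[
  {"id": "ytc_Ugzc3FoPlmUo13BjPY14AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwudaEM1sWDSMh8F8p4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
'''

# Build a lookup table keyed by comment ID so the coding for
# any individual comment can be retrieved directly.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codes_by_id["ytc_UgwudaEM1sWDSMh8F8p4AaABAg"]
print(record["policy"])   # industry_self
```

The second record matches the coding-result table shown above (developer / consequentialist / industry_self / approval), which is how a coded comment can be traced back to the exact model output that produced it.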