Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "What if we create a robot so advanced that it reaches consciousness by itself? A…" (ytr_UgjxCutHJ…)
- "people said that AI couldnt ever be used in a realistic manner, yet in two years…" (ytc_Ugw2qS1tM…)
- "If AI Can do all these jobs because it is so smart and capable.Why don't they ju…" (ytc_Ugw4YKHs9…)
- "That AI Bill was Sssstupid! Just like most Bills in California, the California S…" (ytc_Ugyg2vfDt…)
- "Humans have emotional intelligence. This is something that AI would need to deve…" (ytc_UgytwJ3TM…)
- "A I is not sentient. That’s stupid satanic propaganda. It’s the doorway to dem…" (ytc_UgxsHX8SI…)
- "While I'm sure Big Corporations would love for AI to fully take over ... the fac…" (ytc_Ugx-wIDRu…)
- "Here's the thing, AI will never be like us. The only thing we can create with tr…" (ytc_Ugwljh9bU…)
Comment
@2nd3rd1st Well openai is founded on ideas pioneered by Eliezer Yudkowsky, founder of the current-day Rationalist movement and LessWrong - Sam Altman knows him personally. The whole thing is based around fears of an AI apocalypse and they're trying to prevent one by "being the right people to build it."
Stability was built by Emad and while he's more of a libertarian-lean (not entirely) his whole MO was that AI tools should be open and free for all to access, not some closed-up, proprietary and centrally-controlled thing.
These are both diametrically opposed positions but neither appears to be aligned with anything that resembles traditionalism at all.
Platform: youtube
Video: Viral AI Reaction
Posted: 2025-08-12T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgwPp4hDO-4kZBWTxfh4AaABAg.ALgyT8Y9LO0ALgzYFl8vNI","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwPp4hDO-4kZBWTxfh4AaABAg.ALgyT8Y9LO0ALh5L-h986u","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugz21dznB6eKdemh4Ax4AaABAg.ALgxjgbEE5bALgy6rTfPYW","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugw1fb_tPRXIpu4cMoh4AaABAg.ALgvCywiZnTALh-ztu4RYX","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugw1fb_tPRXIpu4cMoh4AaABAg.ALgvCywiZnTALhEjV3dha9","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw1fb_tPRXIpu4cMoh4AaABAg.ALgvCywiZnTALhI2ynu4TB","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugw1fb_tPRXIpu4cMoh4AaABAg.ALgvCywiZnTALhJMZfLwTT","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzsbpQH5ovS9gt4RDp4AaABAg.ALgrE3358qCALhKrqXbMx5","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzsbpQH5ovS9gt4RDp4AaABAg.ALgrE3358qCALhe-YI6nT1","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzsbpQH5ovS9gt4RDp4AaABAg.ALgrE3358qCALi24Pdef7m","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
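The raw response above is a JSON array in which each record pairs a comment ID with one value per coding dimension. A minimal sketch of how such a batch could be parsed and validated is below; the set of allowed values per dimension is inferred only from the sample output shown here (the real codebook may define more categories), and `parse_coded_batch` plus the example ID are hypothetical names, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the actual codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it has an "id" field and every coding
    dimension holds one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record batch for illustration:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(len(parse_coded_batch(raw)))  # 1
```

Filtering rather than raising on malformed records matches how batch coding pipelines are often run: a single bad record from the model should not discard the whole batch.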