Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The personas are basically nothing more than prompt engineering. You save the p…
ytr_UgwLcqt2R…
If you asked this to ChatGTP, it would probably concur and give you three ways i…
rdc_jef2s8v
Fuck oath. But no, they decided (QLDers especially) to vote in the circus curren…
rdc_f1u6dsn
@LouLou-xz6bw You should look up how learning AI works! We don't even know how t…
ytr_Ugx5TwkKw…
Simply teach AI to solve all humain health problem , and live up to 200-500-or b…
ytc_Ugzl6JO1O…
this idea is absurd - people are different, some are smart others... are limited…
ytc_Ugzt7xvCB…
@oreillysc1that and the bribery from companies like microslop, meta, and AI. It…
ytr_UgzCY_-43…
Don’t be fooled,AI is not able to think for itself. No matter what anyone says, …
ytc_Ugygg38hW…
Comment
> It like how would Stockfish vs Stockfish+Magnus C stock up.
> And the answer is - the M C stockfish would eventually loose :)
> There is no human input, even from the greatest of our minds - in this closed board game mind you, so... grain of salt there - which could help the machine.
> The only thing a human would amount to in this very very narrow case(!) is a bottleneck.
> That said - this is a "closed" game, and also funny enough, Stockfish isn't even a general purpose system.
> However a general purpouse AI with the ability to use stockfish as a "tool"... would send Magnus home :)
youtube
Cross-Cultural
2025-10-19T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx_2xw34sxI8Z9AyQx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxGQmsY-wXdPNh0MRt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwIAsyVKc4xW5jl7Wh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8GNN39KsdgDwONvR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzplx76HkyQ8vqPiMN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzveyt6Z_5eq1H86Sp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxP3-A0hwK9apQsIC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxn44ypGTsVFgJHPgx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz5AfEbEd0XSMTrHd14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugynk6CPUNNK1hyAUzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
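The raw response above is a JSON array of per-comment records, one object per coded comment, keyed by `id` with the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and a single comment looked up by its ID; the helper name `lookup_by_comment_id` and the inline sample payload are illustrative, not part of the actual pipeline:

```python
import json

# Illustrative excerpt of a raw batch response in the same shape as above
# (one record copied from the displayed output; the rest are omitted).
RAW_RESPONSE = """
[
  {"id": "ytc_Ugzveyt6Z_5eq1H86Sp4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxn44ypGTsVFgJHPgx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

def lookup_by_comment_id(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

codes = lookup_by_comment_id(RAW_RESPONSE, "ytc_Ugzveyt6Z_5eq1H86Sp4AaABAg")
print(codes["emotion"])  # -> outrage
```

Because each record carries its own `id`, the UI's "Look up by comment ID" view only needs this kind of linear (or indexed) scan over the parsed array to pair a displayed comment with its codes.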