Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "They are trying to pet these attorneys. They will be replaced. The only entity t…" (ytc_Ugx1C61kN…)
- "_"If a human can do this, then so can an AI."_ AI learning and Machine learning…" (ytr_Ugzj5iEhX…)
- "@KagePoker if your image made up a vast part of the finished image then it's not…" (ytr_UgxPK_doe…)
- "Ironically the self driving cars are actually manned from continents away.... no…" (ytc_UgzitYg0Z…)
- "AI art looks cool and doesn't cost money to make. I don't see any issue with it …" (ytc_Ugz2Eyhsn…)
- "Oh boy NaNo has been a wild ride in the last year for a bunch of reasons. Know…" (ytc_UgwOKuA4Y…)
- "Also, most AI have a propensity for blackmail. And almost all of them can be man…" (ytc_Ugwl7Wrmz…)
- "I disagree that pedestrians should alter their behaviour to accommodate autonomo…" (ytc_UgxdjWT7O…)
Comment
The volume of code is not equivalent to the value of code. More code isn't better.
To be fair a mid level engineer isn't as useful anymore. An expensive high level engineer earning $300k is a better value with AI than 10 mid level engineers earning $30k. The super stars are pulling ahead.
youtube
2026-03-20T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
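A coding result like the table above can be checked mechanically before it is stored. Below is a minimal validation sketch; the allowed label sets are assumed from the values observed in this sample output, not the full codebook, which may define more labels:

```python
# Allowed labels per dimension. ASSUMPTION: these are only the values
# observed in this sample output, not necessarily the complete codebook.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological", "mixed"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "resignation", "approval", "fear", "mixed", "outrage"},
}

def validate(row: dict) -> list:
    """Return a list of problems with one coded row; empty means it passes."""
    problems = []
    if "id" not in row:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# One row copied from the raw response shown on this page.
row = {"id": "ytc_UgxfuTVrmkzts7Vex2F4AaABAg", "responsibility": "none",
       "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
print(validate(row))  # []
```

Rows that fail validation (a missing dimension, or a label outside the set) can then be flagged for re-coding rather than silently accepted.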
Raw LLM Response
```json
[
  {"id":"ytc_UgxfuTVrmkzts7Vex2F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyJ4KwBq7IK2PL81-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy7aIeKtBd0ZzHqEtt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxww4hIrgp4xjLdcmd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz2MFLjl9xTKSOPRfh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw32c-TVev6eaGdM1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz0m52AKGvZ7lszX9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxff0M59HVDr1BY8cJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxDNe5W14AavbjOKst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwBzqrErvA03yl_UKB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
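The look-up-by-comment-ID workflow described at the top of this page can be sketched as follows. The raw LLM response is a JSON array of per-comment rows, so indexing it by the `id` field gives direct access to any comment's coding; the two rows below are copied from the response above:

```python
import json

# Raw LLM response: a JSON array, one object per coded comment.
# Two rows copied verbatim from the response shown on this page.
raw_response = """
[
  {"id":"ytc_UgxfuTVrmkzts7Vex2F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyJ4KwBq7IK2PL81-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
"""

# Build an index keyed by comment ID so one comment's coding
# can be looked up without scanning the whole batch.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgyJ4KwBq7IK2PL81-R4AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist resignation
```

Because each batch response is keyed by ID in this way, the "Coding Result" table for any comment can be reconciled against the exact model output that produced it.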