Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "99% of the time you can tell if it’s an AI voice by how they say what they say. …" (ytc_Ugyved6pG…)
- "Once A.I. is more perfected there will be a high market for Avatar based digital…" (ytc_UgyGWxtSr…)
- "I don't think so ai will in this generation pass human intelligence. Human brain…" (ytc_Ugzse8tfz…)
- "I hope they create ai weapons literally out of spite of a bunch of idiots who wa…" (ytc_Ugwqj1F3W…)
- "Do we need to be super smart to know this AI is nothing good!! Look at this guy,…" (ytc_Ugy2ZgQQv…)
- "This is why I stopped commissioning art. There is no point anymore. No matter ho…" (ytc_UgyurFMA4…)
- "Basically we want to develop the smartest Ai to use It for our dumb human object…" (ytc_Ugx6obTgU…)
- "Even if in ten years most companies decided to just hire a couple of SEs and let…" (ytc_Ugy0zzYz7…)
Comment
I have a similar take to JBlow's on this in which all the uninteresting glue code can be stitched together with AI, not that AI is going to be the driver and the copilot.
I'm genuinely asking everyone for the autopilot analogy I'm gonna use here - what is the "landing" and "takeoff" part of code where manual intervention is absolutely necessary, no questions asked, and what kinds of software engineering can never be done better via AI?
Source: youtube (2025-03-12T15:0…)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxVYkxCyWbgxb_89A14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzEpnnHIVtmQqBfCBV4AaABAg","responsibility":"industry_self","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwtIIqK9mENV0vh7mh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxICJJdH07zOYDYyAJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyCqu7jXsbOnACIrOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOVCb5QS591BfcpAd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzwMyBhSBcVFwM8uMx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxpqZn295w37eMjqGV4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRM7bRx_UAAYx5UFx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx-1OS6cp769s6HNZF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
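A raw response like the one above has to be parsed and checked before its codes can be trusted, since the model occasionally emits malformed records or off-schema labels. The sketch below is a minimal validator, assuming the value sets seen in this sample (the actual coding guide may define additional categories, and the function name `validate_codes` is hypothetical, not part of the tool shown here):

```python
import json

# Allowed values per dimension, as observed in this raw response.
# Assumption: the real coding scheme may include categories not seen here.
ALLOWED = {
    "responsibility": {"company", "industry_self", "developer", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "none"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coded dimension holds a value from its allowed set.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records that fail validation can then be queued for re-coding rather than silently dropped, which is one reasonable design choice for a pipeline like this.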