Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Charlie is not wrong about business using ai art went to McDonalds yesterday the…” — `ytc_UgyGw__sr…`
- “I'll bet that there is another county somewhere in this country that's using the…” — `ytc_UgxT7VFLJ…`
- “the act of predicting can alter the future. so no worries, since most people say…” — `ytc_Ugw2N4j3F…`
- “Imagine if AI actually is creating bots to soften this AI risk theory. We might …” — `ytc_UgwT8rmq4…`
- “Nobody have any idea about what is the future of education and jobs with AI. The…” — `ytr_UgwEA73SO…`
- “Realism is awesome, but I expect it too be used maliciously, but just in gene it…” — `ytc_UgwMoku5u…`
- “The lawyers involved must be DISBARRED! They have a duty to COMPETENTLY represen…” — `ytc_UgzqoliZU…`
- “Nothing to do with AI. It’s to do with Orange Man and his policies. Do you reall…” — `ytc_UgxwHsesd…`
Comment

> Meh. Eric Schmidt might have a vision, that doesn’t make it correct. Specific to programmers though: “What’s the language you program in… it doesn’t matter” ...sure, if all you have is a hammer, everything becomes a nail. But it’s not that simple or black and white in every case. Ten years of deep, hardened problem-solving in a specific low-level language shapes how you think, not just what you produce.
>
> AI is trained on documented patterns and solutions. It can generalize impressively, but that doesn’t make long-earned, experiential expertise irrelevant or interchangeable, much like autonomous driving changes driving, without truly replacing drivers.

Source: youtube · Posted: 2026-02-10T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxOoIYnrlV9T19-hpl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgypoEhceTLLOW9ji3N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy5cDA3X3PoPpMBBx94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugz7Fd5He23XqCbGr7R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzgPbH_SDIyvjlhTNF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugwf07kJ-EHQs0xyHIV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxfZdgEUl-OYBujSkt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxFCi-m_aO29V5FcX14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz3tcBdSOw1eR8DaOp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwAgpIWNhFRDyiaJt54AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
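The lookup-by-comment-ID workflow described above can be sketched in a few lines: parse the raw LLM response as JSON and index the records by their `id` field. This is a minimal sketch, not the tool's actual implementation; the two records shown are copied verbatim from the response above, and the `lookup` helper name is hypothetical.

```python
import json

# Raw LLM response, truncated here to two records copied from the full
# response above (each record is one coded comment).
raw_response = """
[
  {"id": "ytc_UgxOoIYnrlV9T19-hpl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz7Fd5He23XqCbGr7R4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
"""

records = json.loads(raw_response)

# Index records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment; raises KeyError if uncoded."""
    return by_id[comment_id]

coded = lookup("ytc_Ugz7Fd5He23XqCbGr7R4AaABAg")
print(coded["emotion"])  # → mixed
```

The same index also makes it easy to spot comments the model left entirely unclear, by filtering `by_id.values()` on the dimension fields.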