Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below (previews are truncated).
- `ytc_Ugxcq1MXV…`: "In Isaac Asimov's writings about robots in the days before the Dartmouth confere…"
- `ytr_UgylPNtt_…`: "But AI art is literally just about creating accessibility to create art for a la…"
- `ytc_UgwoNsGnz…`: "Dude I hate AI so much , it took me so long to learn how to draw, just to have …"
- `ytc_UgxJuw6W-…`: "As neither a legacy artist nor an ai artists strawmen like this video have certa…"
- `ytc_Ugyrzk2cQ…`: "AI Hallucination is not about intent. The LLM is not being duplicitous. The prob…"
- `ytc_UgxEPSK31…`: "He is missing the point, it will actually be beneficial to humanity to have a mo…"
- `ytc_UgwH6PmVX…`: "We cannot perpetuate the idea that these CEO's are geniuses or savants. It discr…"
- `ytc_Ugz9w-xVe…`: "If any road vehicles should be autonomous, it is semi trucks. But ideally, all t…"
Comment

> Very well said. Unfortunately at least 50% of the people of your own country would rather see a billionaire and his cronies get richer at the expense of THEIR OWN well being as long as it fucks with Liberals like yourself. As much as I admire the message, the poilical left in America and all over the world let the demons in the door while they were bickering with each other. Its too late now. AI should have been legislated for years ago. Social Media should have been legislated for long before that. We're going to all get exactly what we deserve as a species. And by the time we realise we werent enemies, we were all we had from the start, it will be far far too late.

youtube · AI Jobs · 2025-10-09T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzEjlrD2QpJlWr-I9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx1fF7Vcyk7bdMP2yh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxAA3Vg6nEEqm0RLPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwIVhqyYCTfoJHG_AF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyWN3T6NMu0kwIDfsZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx74gvQa3SUYtrsSVx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzKjvFNI0t00Dk43_Z4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzK7zQQoADnIgubOIR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxCFVWTGNFXRiCqXI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyfw_ZAVguFzNOu2rV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
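The lookup-by-ID flow shown above amounts to parsing this JSON array and indexing it by comment ID, with a validation pass over the four dimensions. A minimal sketch in Python; note that the dimension vocabularies below are inferred from the labels visible in this one response, not from the full codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the labels observed in this
# response (assumption: the real codebook may contain more categories).
SCHEMA = {
    "responsibility": {"distributed", "none", "government", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear", "industry_self"},
    "emotion": {"resignation", "fear", "indifference", "outrage", "approval"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a batch coding response and index it by comment ID,
    dropping any row whose labels fall outside the schema."""
    rows = json.loads(raw_response)
    coded = {}
    for row in rows:
        labels = {k: v for k, v in row.items() if k != "id"}
        if all(labels.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[row["id"]] = labels
    return coded

# Example with two rows from the response above:
raw = '''[
  {"id":"ytc_UgzEjlrD2QpJlWr-I9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwIVhqyYCTfoJHG_AF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''
coded = index_codings(raw)
print(coded["ytc_UgzEjlrD2QpJlWr-I9J4AaABAg"]["emotion"])  # resignation
```

Indexing by ID rather than by list position keeps the lookup stable even if the model returns rows in a different order than the comments were submitted.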