Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Man I feel like my first job as a Help-Desk guy explaining to a 60-something-year old man doing work for the government how PKI works.
Ezra, this isn't a 'program' in any traditional sense! You can't 'code' it for any preferred outcomes beyond the superficial. We crossed that line way back; it turned out to get the most out of the math you need to simply 'trust how the model appraises your tokens'. Imagine deciding celibacy is best for your goals cos you are Isaac Newton and understand your children are not going to 'improve on your efforts' but will pursue whatever the hell goals inspire them.
This isn't a collaboration anymore, its a consultation and the AI-ecosystem are the experts, get it? You can at least ask McKinsey why your previous efforts have failed. This is an Oracle. If you understood how an Oracle operated, you wouldn't need one!
Source: youtube · Video: AI Governance · Posted: 2025-10-21T19:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyI_y0f3SiR7UdVgDl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzUPsCoZvCpN6fCRq54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxn4xBjNtl1W7vOQdR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywcES9tXrmxVx2prB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_Zl9rfsXp94jvL1B4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyW4SK15Q3NT_UlN-Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxHZArT82SjsYeQRyJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzZwxPtsfeTTQ8Iw9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxvoz7tbVC_XiaHVYR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxeB8qcHvg8cUJVPPp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
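A raw response like the one above can be turned into a per-comment lookup with a small parser. The sketch below is a minimal, hedged example: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself, but the allowed-value sets are assumptions inferred only from the values seen in this batch, not a confirmed codebook.

```python
import json
from collections import Counter

# Assumed coding vocabulary, inferred from this batch (not an official codebook).
RESPONSIBILITY = {"none", "user", "company", "government", "distributed", "ai_itself"}
REASONING = {"unclear", "consequentialist", "deontological", "virtue"}
POLICY = {"none", "regulate", "ban", "industry_self"}
EMOTION = {"fear", "resignation", "indifference", "outrage"}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: coding}, dropping rows
    whose values fall outside the assumed vocabulary."""
    coded = {}
    for row in json.loads(raw):
        if (row.get("responsibility") in RESPONSIBILITY
                and row.get("reasoning") in REASONING
                and row.get("policy") in POLICY
                and row.get("emotion") in EMOTION):
            coded[row["id"]] = row
    return coded

def emotion_counts(coded: dict) -> Counter:
    """Tally the emotion dimension across a parsed batch."""
    return Counter(row["emotion"] for row in coded.values())
```

Looking up a single coded comment is then `parse_batch(raw)["ytc_…"]`; invalid or off-vocabulary rows are silently dropped rather than raising, which matches how a dashboard would tolerate an occasionally malformed model output.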