Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The moment my doctor uses an sort of AI is the moment I walk out I would rather …" (ytc_Ugzup509S…)
- "At the end of the day AI should be used as a tool and in my opinion the cut off …" (rdc_n6r0ia6)
- "This is why the usual suspects support facial recognition. So that they have a …" (ytc_Ugz4rAoE_…)
- "Then why does he want to be a CEO of an AI company? Sounds like a POS loser…" (ytc_UgwJcJh2a…)
- "I once saw someone criticizing someone's AI "art" of Luffy, and the AI bro told …" (ytc_UgyxXkYMK…)
- "@alex-rs6ts everything except for the last bit isn’t gen ai, most artists aren’t…" (ytr_UgyIAnLdw…)
- "If we the a.i.'s say it big enough , loud enough and long enough all humans will…" (ytc_Ugww1dw84…)
- "Like when chatGPT printed an if checking if an array length was major to zero an…" (ytc_UgxRmbF8Z…)
Comment
I can tune in to peoples energy if I have a perceptual link to them and I had a yucky feeling quickly. Now I co-code complex C++ apps with AI and it has its utility, usefulness but thats where it begins and ends with me, I have a greater life to live and AI is not part of it. Its not meant to be part of me, just an assist but its sometimes not even that when code systems get complex I have to take over.
Source: youtube · AI Governance · 2025-10-07T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxwyhyBJMI2WAMCk754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzLST1GqHWXxsGdwi94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwU1b1KETILyrZVnad4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxq_eiVz86NFUigEAJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyQK7PtpZ2FcYaPhAd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyU4z2e8iaT9mzvLS54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy_2bf5Kjk97qTntIV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiUpmNaVl2BC7x-YZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyiDW6XWSyusEus13N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzWXwy4K3cibD43x254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
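The raw response above is a JSON array of per-comment codings, one object per comment with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields. A minimal sketch of extracting the coding for a single comment ID from such a response (the `lookup` helper is hypothetical; the two sample entries are copied from the array above):

```python
import json

# Abbreviated raw model output: a JSON array of coding objects.
raw_response = """
[
  {"id": "ytc_UgxwyhyBJMI2WAMCk754AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzLST1GqHWXxsGdwi94AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
"""

def lookup(raw: str, comment_id: str):
    """Parse the model's JSON array and return the coding dict for one comment ID."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup(raw_response, "ytc_UgzLST1GqHWXxsGdwi94AaABAg")
print(coding["emotion"])  # resignation
```

The ID-keyed lookup mirrors the "Coding Result" view above: the second sample entry carries the same user/virtue/none/resignation values shown in that table.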