Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "None of what they showed me looked like it was actually produced with AI, meanin…" (ytc_UgzZiqbN1…)
- "I doubt Balaji abandoned his career and committed suicide because of copywrite i…" (ytc_UgyUCzUMX…)
- "Intuition and time with family. Two things are very human and not something AI …" (ytc_UgztFH2N_…)
- "I like AI art. However I do not think it replaces human artists. To me AI art is…" (ytc_UgxRpN3ad…)
- "Did anyone see ricks butter serving robot, the very first one you see in the vid…" (ytc_Ugzgm5y7Z…)
- "@Cr1kkit You can send badly illustrated AI art to the group chat for shits and …" (ytr_UgxDsvyIE…)
- "Seriously, theres already companies layingoff the corporate offices for this rea…" (ytc_UgwVZ0Oep…)
- ""And a machine is replicating the style...." Isn't that what the AI is programme…" (ytc_UgzH8mXNI…)
Comment
The reason humans do bad in the world often relates to our emotions leading us down stray paths due ti some trauma we experienced somewhere in our lives (again trauma caused by emotions). Those things are built in humans though. They AREN’T built into artificial intelligence. Therefore no need to worry….non living robots don’t feel the need for power or control like humans do because they DON’T FEEL LIKE HUMANS DO.
This conversation is moot.
youtube · AI Governance · 2023-04-19T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxISGgE5kvPEkoi4Yt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmYQYlz5HPgD2su-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwA7ExE7k7HLEYiaO14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJdIzsxZsGY04EA6h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgysVfxdluXJbUsL9CR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXOBq9IGFbEd6i6Yx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugysb0ljACgPbxmf6fJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOaeIsSv_9Pi08n0N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLgaxmUJ8Jeg4k1sV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzi4Kwb6PBnzsyhjQF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}]
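The lookup-by-comment-ID step above can be sketched in Python. This is a minimal sketch, assuming the raw LLM response parses as a JSON array of coded rows in the shape shown above; the two rows inlined here are taken from that response, and the variable names (`raw`, `by_id`) are illustrative.

```python
import json

# Raw LLM response: a JSON array of coded comments (shape as shown above).
# Only two of the rows are reproduced here for brevity.
raw = '''[
 {"id": "ytc_UgxISGgE5kvPEkoi4Yt4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "approval"},
 {"id": "ytc_UgxOaeIsSv_9Pi08n0N4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

# Inspect the coding for one comment.
row = by_id["ytc_UgxOaeIsSv_9Pi08n0N4AaABAg"]
print(row["emotion"])  # fear
```

Rows whose IDs are missing from the response (or truncated in the sample list) would simply raise a `KeyError` here, which is one way to surface comments the model skipped.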