Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "if an ai picks up a fucking pencil and tries to actually draw itself, well done,…" (ytc_Ugzm37hQC…)
- "Love your Art. AI can go to hell. The rage I feel when I see AI created photos o…" (ytc_UgxMsRrCa…)
- "I have an idea. The possible answer. It is possibly the most important thing tha…" (ytc_UgwwyUnRP…)
- "HELP- 😭😭 one time I kicked an ai in the balls then I bribed him to be my friend …" (ytr_UgwphuGeO…)
- "GDB has been the kind that moved fast and broke things, caring not about the con…" (ytc_UgzTXZ-n1…)
- "honestly AS A TECHNOLOGY ai is super duper cool. like we made a machine that can…" (ytc_UgyDNtE_v…)
- "@artistsanomalous7369 bro Ai makes images via numbers Ai doesn’t know anything a…" (ytr_Ugzyzl_ii…)
- "Until they can truly understand what creates human consciousness.. they cannot p…" (ytc_UgxQeCF_c…)
Comment
> am I right that I think AI will act just like we act but faster and more complex?
> we create AI to serve our goals by feeding it with all date that exists in universe and because we are too slow to analyze huge data we let AI to do it.
> good and bad exist. However, there is end. it's meaningless that human consciousness will keep on transferring forever through stages with no other goal but to keep on surviving. for what when there is nothing and nowhere to do or to go. Life must shut down then.
youtube · AI Governance · 2022-10-12T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgweccqBr_04Agxp-Yh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxnDFwfxL2QmOVSvhp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwIu-kQXDdJdIlSHtR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxHRscAkmom7dCMJRV4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyDiohFY0TX7-l0rn54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxnm3jumjXIS0tx4N94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxMtgBNh2ztgYYT3G94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxmsisks6g3A5gYAHV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwb_rFYLicR-fRslqx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxn2-WiVqnkf2d9lNF4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
```
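A raw response in this shape can be consumed directly for the look-up-by-comment-ID workflow. The sketch below is a minimal, hypothetical example (not part of the tool itself): it parses a two-entry excerpt of the array above, indexes the codes by comment ID, and tallies one coding dimension across the batch.

```python
import json
from collections import Counter

# Two entries copied from the raw response above; the full array has ten.
raw = '''[
  {"id": "ytc_UgweccqBr_04Agxp-Yh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwb_rFYLicR-fRslqx4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

codes = json.loads(raw)

# Index by comment ID so a single coded comment can be looked up directly.
by_id = {c["id"]: c for c in codes}
print(by_id["ytc_Ugwb_rFYLicR-fRslqx4AaABAg"]["emotion"])  # outrage

# Tally one dimension across the batch.
emotions = Counter(c["emotion"] for c in codes)
print(emotions.most_common())
```

The same pattern extends to any of the four dimensions (responsibility, reasoning, policy, emotion) shown in the Coding Result table.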