Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Victor Frankenstein didn’t think of what might happen if he actually achieved hi…
ytc_UgwH29tnB…
Denying a sentient being rights because of no flimsier a reason than "We made it…
ytc_UghzYjspM…
Him: CHILL CHILL PLEASE I WANNA GET PAID!
Robot: F!CK YOU IM TIRED OF THIS SH!T …
ytc_UgxWMZ9e9…
Ai books bug the HECK out of me.
You're telling me I've been honing my writing …
ytc_UgxYX9rxz…
The background music on this video sounds like AI tried to recreate the Valhiem …
ytc_UgyY5wgU_…
“Corporate moral agency and AI” in IJSODIT predicts “artificial ethics” where AI…
ytc_Ugxx-0FhP…
What's of open source like liama 3 8 b parameters just run it on your strong sys…
ytc_Ugw0J7t2b…
Addendum: Soooo many people love to strawman things like what I just said by imp…
ytr_Ugy-9l3p4…
Comment
I won't say it will never happen, but I think it's highly unlikely AI can truly surpass humanity. AI is only as good as its creator(s). Perhaps it will amass more knowledge, more capabilities, maybe destroy us, but AI will only ever be a flawed reflection of us. And we are already very flawed.
I'm also reminded of "so preoccupied with whether or not they could that they didn't stop to think whether or not they should".
youtube
2023-05-19T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
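Each coded dimension takes a value from a closed code set. As a minimal validation sketch, the snippet below checks one coded row against the value sets observed in this output; the `CODE_SETS` dict is an assumption inferred from the responses shown here, not the project's official codebook.

```python
# Hypothetical code sets, inferred from observed outputs (an assumption,
# not the project's official codebook).
CODE_SETS = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the dimensions whose value falls outside its code set."""
    return [dim for dim, allowed in CODE_SETS.items()
            if row.get(dim) not in allowed]

row = {"responsibility": "developer", "reasoning": "mixed",
       "policy": "none", "emotion": "mixed"}
print(invalid_fields(row))  # [] (every dimension is valid)
```

A check like this catches the common failure mode where the model returns a free-text label (or omits a field) instead of one of the permitted codes.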
Raw LLM Response
```json
[
{"id":"ytc_Ugw-RdCLmstZUklUIgB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwT09jbUuB90Jm7bc94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugztrm6zXamP4RfeNJp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxtB6XDcRLgKxjGkZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxx3_AdJ4Pe6rz6GCp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxgIZNkE3jWSAlbQqt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyJRTrIZS81BwxrQTJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwX4wrGXSpdOl5gmg54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwKFdmsllRavxhUM154AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwT2coWjOQJhyCxuPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```