Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I absolutely hate ai "artists" it is not talent, u r not special bc u put a prom… (ytc_Ugyh1vqXc…)
- the whole point in engaging with art is human connection / generative ai is actua… (ytc_UgzSCrn_N…)
- If you want to develop your body ,you will have to consume " very healthy" , … (ytc_Ugznl6LVv…)
- Butter-passing robot / B-Mo / Omnic monks / Brainiac (maybe?) / Hal 9000 / And absolutely… (ytc_Ugja24tjk…)
- s going to be a major serious problem that AI is going to force on us as they ta… (ytc_UgwO5cHH3…)
- I predict that with the rise of technology, there will be a leap in human cognit… (ytc_Ugw1_xAxk…)
- some opposition parties, rights groups and legal activists have said he should b… (rdc_jrzv8h9)
- AI and LLMs were built to "play roles" being polite and adding a thank you at th… (ytc_UgwM_ylu2…)
Comment
I don't think AI will work like that, a computer cannot just expand infinitely on itself. Your forgetting the AI hallucinations, when an AI overextends it messes up without realizing its messing up. Even super intelligent beings will have to have.... hmm... their "minds" contained, or they will face what we feel as madness.
This is just my suspicion but I do think we will find out true intelligence is not possible without a physical form that gates the amount of information a AI can take in, like the context parameters in current LLMs.
youtube · AI Governance · 2024-06-02T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzuyfz_sQ-pymJOWXV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw-LYQOhZLY-hdBXWh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7cIft2gE3J3Slugx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz6DwkKvjjXUWzvCMV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz5aPDaQIkNhxGUyLt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy7V5BOlgCnZG2NDqZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPw4nMxedrI_7yZhp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwYAOlahszwq7NQa1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgySPEuGE3PKxNBPMaZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwqVnNx2LNMYYQwYhx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
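Looking up a coded comment amounts to parsing the raw response array and indexing it by `id`. A minimal sketch, assuming the JSON array format shown above (truncated here to the single entry matching the coded comment; variable names are illustrative):

```python
import json

# Raw LLM response for one coding batch, in the array format shown above.
# The real response contains one object per comment in the batch.
raw_response = """[
  {"id": "ytc_Ugy7V5BOlgCnZG2NDqZ4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]"""

# Index the parsed entries by comment ID for direct look-up.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

code = codes_by_id["ytc_Ugy7V5BOlgCnZG2NDqZ4AaABAg"]
print(code["responsibility"], code["reasoning"], code["emotion"])
# none consequentialist indifference
```

The printed dimensions match the "Coding Result" table for this comment.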