Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I wonder if we can somehow turn the ban against regulation on AI against the GOP…
ytc_UgzgaU2gC…
There is only one way for us to beat AI - turn off the electricity !…
ytc_UgwNF67sa…
@thewannabecritic7490 So angry. Over something as little as AI..tsk. And seeing…
ytr_Ugwu3qyjr…
Her script is not ai
So it is not
And she is not claiming that at all…
ytr_Ugx_WJtfb…
Utter codswallop!! This is simply Doom Pr0n. LLMs are good at stringing words t…
ytc_Ugwkwa34_…
One big problem will be correcting the damage that's done to the environment, th…
ytc_Ugyjom0aV…
we didn't so much automate away manufacturing jobs as ship them overseas to Sout…
ytc_UgzDOD_CV…
AI is a glorified search engine :P
AGI is a glorified spouse :P (which i don't t…
ytc_UgxKPU25G…
Comment
There's a high probability that an Artificial General Intelligence Global Leader similar to the AI portrayed in the movie Mission Impossible: Dead Reckoning will emerge within 20 years. I believe the world will be controlled by a central hub AI machine. I am optimistic that the world will be more peaceful and abundant if it is run by a highly intelligent machine...a machine free from evil intent, greed & jealousy. This machine's soul mission is to help humanity's well being. Also, there's a 99.90% war will also become obsolete. -- The Terminator scenario is probably 0.001% meaning it won't happen.
youtube
AI Governance
2024-05-10T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgygHOY4OoO3K0q-O_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwI-0nte8IHC2LqV3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugww8otnNsnHTzrT-4J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwWxS297z5PRVlQJIl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwwun-sRZ7l8oEHhhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyobUJZaDcrmmH6LlR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzOzkJ0kOmEf5HbbBp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzKFAnOXsRve4PWYJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx2RAzKKwNSEA3a4tV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWXgSB_ljPEOM4YSB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
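The raw response is a JSON array with one object per comment, carrying the four coded dimensions shown in the result table above. A minimal sketch of how such a response could be parsed and validated follows; the allowed value sets are inferred from the samples on this page (an illustrative subset, not the full codebook), and the comment ID in the usage line is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample responses on this
# page -- treat these as an illustrative subset, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company",
                       "developer", "user", "government"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}},
    raising ValueError on malformed or out-of-vocabulary entries."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row.get("id")
        if not comment_id:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim}={value!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical example row in the same shape as the response above.
raw = ('[{"id":"ytc_example1","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(parse_coding_response(raw)["ytc_example1"]["emotion"])  # approval
```

Validating against a fixed vocabulary catches the most common failure mode with LLM coders: a value outside the codebook (a misspelling or an invented label), which would otherwise silently pollute downstream tallies.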