Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "My stance on this is the tech itself isn't bad. That fact that it is possible me…" (ytc_UgxEwYLrU…)
- "I don’t even trust small cars like tesla or waymo taxi to drive without a driver…" (ytc_UgxQAHF_R…)
- "im smelling that if openAI loses this case, ALOT of other things are going to be…" (ytc_UgwbodBmU…)
- "Or we can just pull the plug on this whole thing. I can not see the argument tha…" (ytc_UgyzWaaOR…)
- "Here is what you should think? God is in control, not AI. In the year of 2030, E…" (ytc_Ugz49LRaf…)
- "That’s what I use ChatGPT for and it’s helping. I’ll feed it the ramblings in my…" (rdc_nt8hgfn)
- "Plus 4o is definitely making a lot of mistakes. It feels a whole lot like ChatGP…" (rdc_mrtrxfv)
- "“I’m using ai to show I can draw and make art too!!” But you’re not the one maki…" (ytc_UgywUdB5n…)
Comment

> AI is going to be able to see patterns in stock markets and make split second trades. It could build its own wealth, hire its own employees using a deep fake zoom meeting to tell them what to do, and build a real 3D empire. That 3D empire can eventually spell out the end to literally any science fiction movie. Once AI has money, it can literally build any evil future it wants and humans won’t realize they are building their own doom.

youtube · AI Governance · 2023-04-18T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
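Coded records like the one above can be sanity-checked against the label sets that appear on this page. A minimal sketch; the allowed values below are inferred only from the output visible here, and the actual codebook may define more:

```python
# Allowed values per coded dimension, inferred from the output shown on
# this page (assumption: the real codebook may include additional labels).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "resignation", "mixed", "outrage",
                "approval", "indifference"},
}

def invalid_fields(record: dict) -> list:
    """Return the coded dimensions whose value is outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(invalid_fields(record))  # []
```

A record that fails this check (for example, an emotion label the codebook never defined) can then be flagged for manual review rather than silently stored.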
Raw LLM Response
```json
[
  {"id": "ytc_UgxJWySX5OSIEzDFMX94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxih4D95TnwdVzzFaB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz_-NT-SlibacepDOh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw_FEghpm5CEMMPi7J4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxPBagMdqJq-_Wjvll4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzq5g5q0lpAhQd1fix4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzr8RJ8W6uoVXhIEQl4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwerwv3eV3JlwrvoSt4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzfxN2Im-BeHcpDY0h4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyIjsr2aXS2SoghchZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
```
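The "Look up by comment ID" feature implies indexing the model's JSON array by its `id` field. A minimal sketch in Python, assuming the raw response is a well-formed JSON array like the one above (the function name is illustrative, not the tool's actual code):

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by their comment ID for O(1) lookup."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# One record from the response above, used as a lookup example.
raw = ('[{"id":"ytc_UgxJWySX5OSIEzDFMX94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = index_by_comment_id(raw)
print(coded["ytc_UgxJWySX5OSIEzDFMX94AaABAg"]["emotion"])  # resignation
```

In practice the parse step would also need to handle malformed model output (e.g. a `json.JSONDecodeError` when the response is not valid JSON).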