Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
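In code, that lookup is trivial. A minimal sketch, assuming the coded records are held as a list of dicts with an `id` field (the shape shown in the Raw LLM Response below); `find_coded_comment` is an illustrative name, not part of this app:

```python
from typing import Optional

def find_coded_comment(results: list[dict], comment_id: str) -> Optional[dict]:
    """Return the coding record for one comment ID, or None if absent."""
    return next((r for r in results if r.get("id") == comment_id), None)
```

If lookups are frequent, a dict keyed by comment ID is the better structure; the sketch after the raw response below builds one.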
Random samples — click to inspect
- "He says we dont know how AI works ! Thats strange , those who created AI dont kn…" (ytc_UgyzWI-md…)
- "Idk thats already kinda what fox news was. And let's just be real for a second h…" (ytc_UgwVh35bz…)
- "I think the problem is we keep calling them A.i's, they aren't Artificial intell…" (ytc_UgyZiTv5h…)
- "I’m sorry but why would AI try to take over the world? The real danger is if our…" (ytc_UgwI_7_j0…)
- "Right as the new bill is going through that will ban states from regulating AI w…" (ytc_UgzV5FMrO…)
- "@LordofdeLoquendo im just saying that AI art is for lazy people who don't belie…" (ytr_UgwIRPEVv…)
- "What you call fintech is just part of standard banking in lots of countries. No …" (ytc_UgwyQMGo0…)
- "I think photoshop should have its own version of nightshade you can automaticall…" (ytc_UgzIlMIP0…)
Comment
Worked for GM in the late 2010s, they kept telling us they'd have full self driving cars ready for purchase by 2019. Everyone thought I was a fool when I kept telling them there was no possible way us or anyone would have them other than in very limited and restricted areas, and definitely not for consumer purchase anytime close to 2019.
They would literally laugh at me and tell me I'm wrong and tell me how close we were. I'm like look at the current level of consumer tech theres no possible way. Your phone can't even do what it's supposed to without glitches at times. A self driving car glitch means someone dies.
Until/unless we have actual AI that can think at least as good as a person and react to real world situations as good as a person, we will never have it. We'll get 99% of the way, but that last 1% is a huge deal. Even if they're safer then people driven cars, people will not tolerate a robot causing deaths. They need to be 100% reliable.
Platform: youtube
Timestamp: 2025-07-30T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
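Each value above is drawn from a closed codebook. As a rough validation sketch (the allowed sets below are inferred only from the records visible on this page, so they are almost certainly incomplete):

```python
# Validation sketch. The allowed values below are inferred from the
# records visible on this page and are almost certainly incomplete.
ALLOWED = {
    "responsibility": {"company", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"mixed", "outrage", "approval", "indifference"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems
```

An empty return list means the record uses only known codes; anything else flags malformed model output before it is stored.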
Raw LLM Response
[
{"id":"ytc_UgweanYgi1cMkl3kXXd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8s8wfF715qAxELwV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwSYprJ1vU9YDlWsad4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzaK1g-0gLPm_aP8iB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLYWZOn5p4Rb5Gn954AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz_DZ-PxXbQw6TxiOF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxbApOyaC1noq2hm-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVWVL5uU1qX6k0GTZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2qhCzYuHBIlpHYkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4sh8eeWAbooWilSl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
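Because the model returns one plain JSON array per batch, `json.loads` plus an index by ID is all downstream code needs. A minimal, self-contained sketch (the embedded record and the `ytc_example1` ID are stand-ins, not real data):

```python
import json

# Stand-in for one batch of stored raw model output (a JSON array,
# matching the shape shown above). The record here is illustrative.
raw_response = """[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]"""

records = json.loads(raw_response)
coded = {r["id"]: r for r in records}  # index by comment ID
print(coded["ytc_example1"]["emotion"])  # -> mixed
```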