Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any comment by its comment ID, or inspect one of the random samples below.
- "@harrietjameson exactly, like if the government approves this kind of usage on a…" (ytr_UgxTw9SKz…)
- "Though ai art is nothing wrong. Some people using it really are too comfortabl…" (ytc_UgzXGxbG5…)
- "That way that the second robot death stares at the camera 😂😂😂 I think that robot…" (ytc_UgwK_qTZB…)
- "if nobody has a job and money... who is buying stuff? Having AI workers but no p…" (ytc_UgxNWKdDg…)
- "Rich people are so excited over AI, that they’re fantasizing capabilities and th…" (rdc_m54z54i)
- "I take the self driving taxi car to my appointments I love it because it’s not T…" (ytc_UgzQ7Y-wH…)
- "Tower of Babel was the foreshadowing of AI. Repent, and turn to Jesus Christ AS…" (ytc_UgxLmXR6m…)
- "but bro they spent so much time working out the perfect prompts bro!! they even …" (ytr_Ugzayvuht…)
Comment
The first thing that has to be done in order for people to better understand AI is to _change the name._ It's not intelligent. It can't combine known data to come up with _new_ data. That's the first mark of intelligent, taking what is known and extrapolating a new idea. Machines can't do that, have _never_ been able to do that, and never _will_ do that. Why? Because that requires sentience and for something to be sentient there needs to be an element beyond the physical machinery, a non-corporeal entity. And since science doesn't accept the existence of such things (because they can't be measured and therefore couldn't possibly exist) how could scientists ever create one, let alone instill it into a machine?
youtube · AI Responsibility · 2024-04-15T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxXn_kY2d7kqZubVll4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzvJT3vrHttOmz_arJ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzfAhS5tgssCm36tYx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxatmtSCsHyNBf8ac54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-4lAkl7avXSX-_iB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzL8EJRwy-sNQEQwIJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy85qIhCHfB9_u8etV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyi8N_Vb4Kt1JXZFMt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwOJul1_m4VxbnlICd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxgtZnpF3ctHDVabyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
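A response like the one above can be checked programmatically before the codes are stored. Below is a minimal validation sketch in Python, assuming the allowed category sets are exactly those visible in this sample (the actual codebook may define more values, and `SCHEMA` here is an inference, not the tool's real schema):

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip entries without a comment ID
        # keep a record only if every dimension holds an allowed value
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_X","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}]'
print(len(validate_response(raw)))  # prints 1
```

Records that fail validation (unknown category, missing dimension, missing ID) are dropped rather than repaired, so a downstream re-prompt or manual review step can handle them.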