Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "yes that is ultimately how it would turn out, however the average layman has no …" (ytr_UgyN04GH4…)
- "Why is no one in the front seat, ready to take over the robot? This is a faked …" (ytc_UgxqeoEhF…)
- "The idea of targeting citizens is nothing new, but the level of technological so…" (ytc_UgwxRVtTE…)
- "So I work with software and automation. Currently AI is impressive and is slowly…" (ytc_Ugwl4f_gk…)
- "@thenovicewhispers I’m saying we can’t define life, I’m just scared of that conc…" (ytr_UgxV9TCIe…)
- "Robot 1:i ma do mAh work- / Robot 2:*throw the box of the hole* / Robot 1:WTH N…" (ytc_Ugz2Z_J0d…)
- "Problem is AI will be able to take over voting regulations and who lives or who …" (ytc_UgxSXFXSl…)
- "Also, not everyone has the resources to automate, so all the little famers will …" (ytr_Ugzgi-llR…)
Comment
Where did they find that professor, like wow is he stupid. I don't know who this professor of "Computer Science" thinks he is, but he knows nothing. AI isn't some program you just type into and it tries to do what you have typed. You have to program it to do a certain task, it cant just go do a task on its own that completely unprogrammed. So the whole give everyone a tumor to solve cancer would never happen. The programmers would literally have to program the option and prequities into the robot. This is just propaganda and fear mongering.
youtube · 2020-01-28T21:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_UgzB0hWsFrZCkHqYAXF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnEOAU8JpZ64qGrs14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyPxPXM5LLBrIvCvdt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxHAG2eRj_MmPhkAbx4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzkkDtz_SMxzcO0Ml94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyttFx0rgxVHFiysf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzuZKgDqAeAY6WTA894AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyL7xH8D5Q96XRCACN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwvoRfzxk-72qJWiA54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwpnozIsuwsKwHGPcJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
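A raw batch response like the one above can be indexed by comment ID and checked against the coding scheme before use. The sketch below is a minimal, hypothetical parser: the allowed values per dimension are inferred only from the samples shown here (the full codebook may define more), and the function name `parse_batch` is illustrative, not part of any real pipeline.

```python
import json

# Allowed values per dimension, inferred from the coded samples above.
# Illustrative subset only; the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "distributed", "developer", "ai_itself",
                       "government", "company", "unclear"},
    "reasoning": {"unclear", "deontological", "mixed", "contractualist",
                  "consequentialist", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    dropping any record with a missing or out-of-schema value."""
    coded = {}
    for record in json.loads(raw):
        if all(record.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[record["id"]] = record
    return coded

raw = '''[
 {"id": "ytc_UgzB0hWsFrZCkHqYAXF4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_bad_record", "responsibility": "nobody",
  "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]'''
codings = parse_batch(raw)
print(sorted(codings))  # the bad record is filtered out
```

Dropping malformed records (rather than raising) keeps one bad LLM output from discarding the whole batch; rejected IDs could instead be queued for re-coding.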