Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Stop calling ai guilty! The real guilty person is the one using it to steal art …
ytc_Ugw7_V3vp…
Have people brought up the idea that the purpose of these AIs isn’t to become ne…
ytc_Ugxfrb4cv…
Facial recognition is not enough. It is so amateur. Now detectives will blame th…
ytc_UgxK7Q-_D…
This is why I just don't understand ask something pertinent how do you feel aliv…
ytc_UgyXKGrFl…
Robotaxi doesn’t need to do the drive faster especially if it’s less expensive (…
ytc_Ugy0duFYp…
Anybody can build AI now!!! just buy a decent PC with descent graphic card and m…
ytc_UgwC_PrKi…
Its not ai that has to be blamed for anything! Everything is result of our own m…
ytc_UgzS3S03K…
“They” are all AI already
We are either going to assimilate
“Take the number…
ytc_UgzSqG7fW…
Comment
The less involvement of the human, the better
AI will not make the mistake of a human (or at least make much less than human), like a false alarm of launching a nuclear missile or such, which almost happens a few times already, just go google it, the reduction of human error would benefit everyone, Self-driving car is a great example, they are not perfect but already much better than human, and that should be good enough, that said AI is still not at the point where it could be without any human involvement yet ;/
youtube · 2018-05-09T13:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzcCwCegUbtvngQPJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGa0-tnosDqblNKKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9dZ_JKNLYmU8ndnp4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzJN32BfiHsOID-bad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyCPC1G6daYnx_uaaV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuBtrBfLrj0IdLust4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5YVOYSxB0W7uyJYh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwsy769faCmWMqna-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypyowmCCN501wSRRZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzhzWKI7aW7yWeIdhd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
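The raw response above is a JSON array of per-comment codes, one object per comment ID, with the same four dimensions shown in the coding-result table. A minimal sketch of how such a batch might be parsed and sanity-checked, assuming the allowed-value sets are inferred only from the values visible in this sample (the full codebook may permit more):

```python
import json

# Values observed in the sample response above; assumption: these sets
# are NOT exhaustive, they only cover what appears in this dump.
ALLOWED = {
    "responsibility": {"none", "government", "ai_itself"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "fear", "resignation", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record's shape."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        assert rec["id"].startswith("ytc_"), rec
        for dim, allowed in ALLOWED.items():
            assert rec[dim] in allowed, (rec["id"], dim, rec[dim])
    return records

raw = ('[{"id":"ytc_UgzcCwCegUbtvngQPJl4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')
records = validate_response(raw)
print(len(records))  # 1
```

A check like this catches truncated or malformed model output before the codes are written back to the database, which is useful because LLMs occasionally emit invalid JSON or off-schema labels.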