Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
You cant call yourself an "artist" if you let a literal robot do your job for yo…
ytc_UgxU6IYCD…
There will never be another wiki leaks. Any video evidence will be dismissed as …
ytc_UgwleszF6…
16:31 The commercial stuff is so real. Every time I get an ad for AI I roll my …
ytc_UgyENLS7O…
Ai art won’t actually replace art made by humans because to actually innovate, y…
ytc_Ugzoi_bQY…
Nah this is not just on AI alone, anyone who actually took a STEM course would d…
ytc_Ugxc_OBhc…
The biggest issue here is safety. Having retired from 55 years on the road, ther…
ytc_UgzXzIV0b…
And indeed AI is racist as shown by Gemini, because the idiots programming it ar…
ytc_UgwZehG4P…
Totally agree on the robot tax at least and while it won't help citizens feel li…
ytc_Ugw3Eq_zx…
Comment
This theories about AI killing humans is bullshit. We humans have a survival instinct based on biology. As a result, we have a behavior pattern like group-playing, which also includes species preservation but also competition. All living beings have this behavior pattern in some way, yes also plants. It is not a result of consciousness and intelligence, but of evolution, existence (and death) is limited to organic matter. Why should an AI think and act in these categories? What would be the motivation to kill the humans, what goal should be achieved with this?
youtube
AI Moral Status
2025-04-28T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyXTyObjAANjXCxEgZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyARUgOLkV_qPTfkFF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxTvOlcbw0qn4LfvlR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWB7ADIvo0_kchDXR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwFc0TbgEmit6t1UgR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwrE6iPqqsV1ld19ex4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-t2c8gwo7glwJgtF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx5rZthk32xyFH94T54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwFjxlUpJ7AwDLr3MV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy4FC0WT32aKm7Qp5F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
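A raw response in this shape can be parsed and checked against the coding dimensions before it is stored. The sketch below is a minimal validator, assuming the category values observed above (the full codebook may allow values not seen in this sample, so `ALLOWED` is an assumption, not the authoritative schema):

```python
import json

# Values observed in the coding table and raw responses above.
# ASSUMPTION: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "distributed", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records outside the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# Usage with one record from the response above:
raw = ('[{"id":"ytc_UgyXTyObjAANjXCxEgZ4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
codes = validate_codes(raw)
```

Failing fast here keeps malformed or hallucinated category values out of the coded table rather than surfacing later as odd dashboard rows.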