Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@W1ckem I might surprise you but : no women who's not a sex worker is flattered …
ytr_UgxP6904N…
If people don't have jobs, then people cant purchase stuff, if people cant purch…
ytc_Ugx31ab6c…
Must have no moral compass...is the guest stupid or brain dead!!! Musk is known …
ytc_UgwTJtm4X…
Always remember that anything we're allowed to read about, and especially anythi…
ytc_UgwN0xTIp…
If everyone had Tesla autopilot there wouldn't be any bad accidents anymore... …
ytc_UgzhRXbLj…
This!!! I don’t understand why nobody talks about how much AI bloats code. I’ll …
rdc_ohtnlpn
Do you understand how dumb the general population is to think desk jobs with deg…
ytc_UgywAAF0G…
When AI takes over it will start attacking the human race. We need to build on …
ytc_UgyDux1HA…
Comment
I'm sorry, Hank, you're really anthropomorphizing AI models here. Computers aren't going to colonize the world because they don't have desire. They don't experience want. They don't have biological processes that drive us reproduce or ensure our own survival. The real issue here is access. Given enough access, the intelligence model would mirror the perceived goals of the input data, which could potentially be harmful to whatever system they have access to, but there is nothing generative about AI. It will never have its own original thoughts or desires. These are distinctly organic properties
youtube
AI Moral Status
2025-11-21T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyBnx5CxIB82ljFO1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwW03VC3ed2y9oK7gZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwn25wAFkJnqENhYT54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwC6zBKJni2YOF4I1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpPkkyw0y8NgioAdN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1mxNKiB8GX_mGHEp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhGOWDsh16fzeUiC14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxnaKHF1_ABsdOJPcZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzyB0hHhiI8cUwx-hV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxUHgo7TyISEkd9SNJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
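The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions from the result table. A minimal sketch of how such a response might be parsed and validated before storage — the allowed label sets below are inferred from the samples on this page and are an assumption, not the full codebook:

```python
import json

# Allowed labels per dimension, inferred from the sample output above;
# the actual codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, skipping invalid rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # a row without an id cannot be joined back to its comment
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep the row only if every dimension carries a recognized label.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_codes(raw)["ytc_example"]["emotion"])  # indifference
```

Validating against an explicit label set catches the most common failure mode of batch coding prompts: the model inventing an off-codebook label, which would otherwise silently pollute the coded dataset.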