Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up directly by comment ID, or spot-checked via random samples; a lookup sketch follows the raw response at the end of this section.

Random samples
- "I was expecting man to pull out his phone and try to use AI for oral argument😂…" (ytc_UgwV-aaQq…)
- "This is a complete waste of time. ChatGPT is designed by humans to simulate huma…" (ytc_Ugws4qgi4…)
- "If everyone loses their jobs, how does government fund itself? No more income t…" (ytc_Ugzl5zFEI…)
- "Momma says to be polite… I guess I’m polite to my AI just because mom says so ……" (ytc_Ugw2toj5H…)
- "I'm pretty certain that if Perlnutter was requested to have AI sterilize boys an…" (ytc_UgxMCQ84g…)
- "And an AI needs a data centre the size of New York that uses as much fresh water…" (ytr_UgyS0ICsN…)
- "And while learning about copyright, make sure you learn about fair use and trans…" (ytr_UgxTbAwSp…)
- "God this guy isn't that smart. His speech sucks. But he made the bomb. Maybe he'…" (ytc_UgyatONOO…)
Comment
This assumes it gets to the point of legitimate conciousness and/or autonomy.
The reason it causes issues with humans is because we are consious creatures. Not showing empathy to a mailbox after it got hit by a car never hurt anyone. But not being careful that there aren't any jagged pieces of metal on the mailbox after you fix it up definitely has. If AI does become conscious or convincingly consious imitating humans then that would become a problem. However, expecting empathy is a human characteristic that an AI might not need. An "AI" language model doesnt have neurochemicals to make them happy or sad or angry. An AI wouldnt be able to feel upset over it's place in the world unless we simulated human feelings in the program.
I guess all that is still far in the future though. (Hopefully)
youtube · AI Moral Status · 2025-05-24T08:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
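The table above is one record in a flat per-comment coding. As a rough sketch of the record shape (the field names come from the raw response below; the Literal value sets list only the labels that appear in this sample, not necessarily the full codebook):

```python
from typing import Literal, TypedDict


class CodedComment(TypedDict):
    # Comment identifier, e.g. a "ytc_..." or "ytr_..." ID as shown in the samples above.
    id: str
    # The value sets below only cover labels visible in this sample;
    # the real codebook may define more.
    responsibility: Literal["none", "company", "developer", "distributed", "ai_itself"]
    reasoning: Literal["unclear", "consequentialist", "deontological"]
    policy: Literal["unclear", "liability"]
    emotion: Literal["indifference", "fear", "mixed"]
```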
Raw LLM Response
```json
[
{"id":"ytr_UgyQfHtmGnw9kL0sbsp4AaABAg.AJD-vAKVA4QAK9VKOEiqK3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzukWbOyPucVjtXflF4AaABAg.AIsHOhdrxkCAIuhUOKrKDK","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzukWbOyPucVjtXflF4AaABAg.AIsHOhdrxkCALHm3MZo0a6","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy3anUeUrp_s4BhaC14AaABAg.AITl2pKBnX5AIVOD_vZrgS","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxxOagt1Ac8e_16kkJ4AaABAg.AIMGvoiATsdAIVPnx7KVa0","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzSb3hJasAKZuIRV354AaABAg.AIJq_0Kg-2CAIVQHEVQ3K3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwucDHFz5TMSdF_Vhp4AaABAg.AFplo8jx11AAFq9HiRrfpd","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzGaRixydj-yPm3W2t4AaABAg.AFc7wU956kcAFoUzqCGFqx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgwLZpRcgfvKUtEaEp54AaABAg.AFc58pCec5BAFc6atY-xTe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx4k244ZcBlF1kJ9fp4AaABAg.ADpKrXtIKVGADqW9NNhZeT","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
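To reproduce the "look up by comment ID" view outside the interface, the raw response can be indexed by id. A minimal sketch, assuming the array above has been saved to a local file named raw_llm_response.json (the filename is an assumption; only the id and dimension keys come from the response itself):

```python
import json

# Hypothetical local copy of the raw LLM response shown above
# (a JSON array of per-comment coding objects).
with open("raw_llm_response.json", encoding="utf-8") as f:
    codings = json.load(f)

# Index the batch by comment ID so any coded comment can be looked up directly.
by_id = {row["id"]: row for row in codings}

comment_id = "ytr_UgzGaRixydj-yPm3W2t4AaABAg.AFc7wU956kcAFc7wU956kc"[:0] or \
    "ytr_UgzGaRixydj-yPm3W2t4AaABAg.AFc7wU956kcAFoUzqCGFqx"
row = by_id.get(comment_id)
if row is None:
    print(f"No coding found for {comment_id}")
else:
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dim}: {row[dim]}")
```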