Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Robots bc if u look were there ear should balong there is metal and u will hear …" (ytc_Ugxae1iYL…)
- "My conversation with Chatgpt when is acting up when I'm using it to work on my p…" (ytc_Ugy2MP4fC…)
- "@Donbon3to expand on this, i do not have the time or money to develop my art sk…" (ytr_UgyH2JI87…)
- "11:08 i think clip studio paint also wanted to try adding an AI generator to the…" (ytc_Ugy4Kt69r…)
- "I don’t really know much about AI or algorithms or any coding but i know what I’…" (ytc_UgxFs9j7A…)
- "8:55 I complete agree. I remember this phase and remember interviewing so many p…" (ytc_UgxcqFgrO…)
- "For all the ignorant clowns out there: when money is saved it’s spent somewhere …" (ytc_UgwvPiN7n…)
- "Everyone likes jumping on benches and throw their backs on their arms, no one si…" (ytc_Ugyh2uVEs…)
Comment
Living beings start learning experientially even prior to being born to some extent. AI can't really learn in that manner in the same sense that animals do through our physical senses.
What if AI becomes smart enough and has it's automated infrastructure, and rather than it using energy where it creates more carbon emissions and in turn raises the earth's temperature, it works to intentionally lower the earth's temperature because electronics can work more efficiently at cooler temps? Just a thought that occured to me at the end there.
Source: youtube · AI Moral Status · 2025-10-30T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyABG2BqQo_bQ0RTeF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzwKIBkTIjwF5QgSOR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzp0VQ5QCWvMSJH6-h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXt8u0LAlcm6JcuIJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2mNarWuP2T8jCTfJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCJBabiQ3Iz1EJtSp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwP2sI4oMWXokqcHV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfXKjmHwOdcVoYIAd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxNQQH7JScRsLDbMUp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnUXuXIdWgn0uB8bd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
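The raw response above is a JSON array of per-comment codings across four dimensions (responsibility, reasoning, policy, emotion), which can be indexed by comment ID for lookup. Below is a minimal sketch of parsing and validating such a response; the allowed category sets are only those observed in the sample above, and the full codebook may define additional values, so `ALLOWED`, `index_codings`, and the sample ID `ytc_x` are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Category values observed in the sample response; the real codebook
# may contain more (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) and
    return a dict keyed by comment ID, validating each dimension."""
    codings = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row[dim]!r}")
        codings[cid] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Hypothetical single-row response in the same shape as the array above.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = index_codings(raw)
print(coded["ytc_x"]["policy"])  # regulate
```

Keying by ID mirrors the "Look up by comment ID" control: a coded comment such as the one displayed above can then be retrieved in constant time, and malformed or out-of-schema model output fails loudly at parse time rather than silently polluting the coded dataset.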