Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "That was a human dubbed. He said ah, while thinking of what to say. AI wouldn't …" (ytc_UgylQHbFj…)
- "A.I. has already escaped containment. The last pandemic was its first attempt to…" (ytc_Ugz8oMFU8…)
- "I once made an ai chat of my fursona that is BASED ON ME (I’m a minor) and withi…" (ytc_Ugz4l-O8u…)
- "yo anyone tought abt those facebook ai jesus images i always get? i think ima ge…" (ytc_Ugz7dtt07…)
- "It’s a robot guys I was looking at the ears and saw a open black space, so it mi…" (ytc_UgyZGprVm…)
- "I did it for years in high school with no robot and I can tell you from experien…" (ytr_Ugwcss8D3…)
- "Surprised it wasn’t already this way honestly. I guess they’ve regained credit w…" (rdc_ohzafks)
- "Genius human created AI, then AI surpassed human intelligence, what a remark…" (ytc_UgxsMDFMg…)
Comment
This is a fantastic video that really sums up a lot of what concerns me about the explosion of AI lately. There's another element I worry about, though; if an artificial intelligence emerges that is human-equivalent or greater, does it have rights? I would think the answer is probably yes, but there will be a huge financial incentive for humans to resist that for as long as possible - it will be very convenient and profitable to effectively enslave an intelligence of that level. If people have managed to convince themselves that other HUMANS don't deserve rights and freedom when it's profitable to do so, how much easier is it to convince ourselves of that when it's something that doesn't look like us? I just have this worry that we're going to do some really, really regrettable shit before we work that out.
Source: youtube
Video: AI Moral Status
Posted: 2024-04-08T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzM8b_npQbgwBNd_Th4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzkPAMQPrCEm7ZTD8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0HHoayPHS7RIgmtN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwNBtbT9A_mxPPC4wN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxfUKWOCdBHrBqiJFN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYrG3jPLQ7ve3Fui94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxk2mjrnKnleJGRUsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxlXOuVID8nvSE13Cx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyunI4LKkwnEHXE_xJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwO2c-Ookj3qnhNVCR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
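The lookup-by-comment-ID feature above can be sketched in a few lines: the batch response is a JSON array of objects, each carrying an `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion), so a dictionary keyed by ID gives constant-time lookup. This is a minimal sketch under that assumed format; the comment IDs below are hypothetical placeholders, not real IDs from the dataset.

```python
import json

# Example batch response in the same shape as the raw LLM response above
# (IDs shortened and hypothetical for illustration).
raw_response = """[
  {"id": "ytc_abc", "responsibility": "none", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_def", "responsibility": "user", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "outrage"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and map comment ID -> coded dimensions."""
    rows = json.loads(response_text)
    # Drop the "id" key from each row so the value holds only the dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_def"]["emotion"])  # → outrage
```

Parsing the full response once and indexing it this way also makes it easy to spot IDs the model dropped or duplicated before trusting the coded output.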