Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Not now. With current technology it would be near impossible to create a robot who's conscious. But, it's in the future it will probably be possible. Think of it this way, how did we or any other intelligent species in the univers gain consciousness? We don't know. Natural selection, though that's probably not everything. You might not even be conscious right now. Everything from bodily functions, movement, to speech is technically all automated by your brain. So the idea of sentient robots is not far fetched. They might achieve consciousness, tho in a different way than us. Also Idk what people are worried about. Like, a motherboard will never become conscious?????
There have been too many terminator like movies, and people are too scared to use common sense 💀💀💀💀💀
Source: youtube · Topic: AI Moral Status · Posted: 2023-11-20T12:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzzJQ28cSoPR0UKe_N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOpbr-qiyf3YV60IJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzo2vCG2kfduaxZJZZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgysgXiAwPHDHjw4jrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxY5dMrC9fRJOclvBd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyY31PaCErozcsE6bN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwnYU5mUWPdNVMibiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxelZvMpTy_U5_-Tnd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzMZ1xbQDpK-jeZPlp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRpqtEl6wmCT2K67h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
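A minimal sketch of how a raw response like the array above can be looked up by comment ID to recover the four coding dimensions (responsibility, reasoning, policy, emotion). The `lookup_coding` helper and the two sample records are illustrative assumptions, not part of the tool itself:

```python
import json

# Assumed shape of a raw LLM response: a JSON array of records, each
# keyed by comment "id" with one value per coding dimension.
raw_response = """
[
  {"id": "ytc_UgzzJQ28cSoPR0UKe_N4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgysgXiAwPHDHjw4jrx4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]
"""

def lookup_coding(raw, comment_id):
    """Return the coding record for comment_id, or None if absent."""
    for record in json.loads(raw):
        if record["id"] == comment_id:
            return record
    return None

coding = lookup_coding(raw_response, "ytc_UgysgXiAwPHDHjw4jrx4AaABAg")
print(coding["emotion"])  # approval
```

Because the model returns one array per batch, a linear scan per lookup is fine at this scale; for large corpora the array could be indexed into a dict keyed by `id` once and reused.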