Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Why do companies and CEOs think AI can replace human employees???? Their criteri…" (ytc_Ugwa9QSWH…)
- "It is not true that all religions are basically the same (carving out the "local…" (ytc_UgzCMRZnS…)
- "That kind of defensive capability only applies to a certain set of delivery meth…" (rdc_dl0cqpm)
- "I'm still really bad at drawing, but the one thing I think is most important abo…" (ytc_Ugw870ncn…)
- "Bro in 5 years I'm gonna be seeing a hyper realistic video of me committing a wa…" (ytc_UgxvcfLbj…)
- "@JolieG659 still.. thank you for your comment so ai learn how it works better th…" (ytr_Ugyb8LvCH…)
- "It might be all hyperbole and not intelligent at all. It could possibly be just …" (ytr_Ugw7-f-0n…)
- "Is it the feeling that AI can't feel yet we knew that if it was a full on entity…" (ytr_UgwajbCrL…)
Comment
AI does not need to outperform a human. And software that outperforms humans has been around forever.
AI needs to learn. If the software learns slower than humans or not is irrelevant.
Artificial SUPER intelligence is a system that can learn better than humans, but may be limited to a particular domain, such as chess.
AGI is AI that works in all domains. But it might not be any better than humans.
Artificial general superintelligence is an AGI that learns better than humans.
youtube · AI Moral Status · 2025-07-23T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
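Each coding result is a flat record with one value per dimension. As a minimal sketch, here is how such a record could be validated in Python, using only the dimension values that appear in this page's raw responses (an assumption: the full codebook likely defines more values than are observed here):

```python
# Allowed values per dimension, as observed in this page's raw responses.
# Assumption: the actual codebook may define additional values.
CODEBOOK = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "virtue"},
    "policy": {"unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if it passes)."""
    problems = []
    if "id" not in record:
        problems.append("missing comment id")
    for dimension, allowed in CODEBOOK.items():
        value = record.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}: unexpected value {value!r}")
    return problems
```

For the record shown in the table above (responsibility none, reasoning unclear, policy unclear, emotion indifference), validate returns an empty list.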
Raw LLM Response
[{"id":"ytc_Ugwgf4ZkMQWWMEcLz3x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgwYAcsLrqW7Vg_vmdJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},{"id":"ytc_Ugx603GjUAM3qrD3yHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_Ugwz6Fx-MdXRThlQN994AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgxIWhTHRuUW3ctU9xR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgwG3eSutZUjN6ypYjJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgxB3MVoaRy_o-qsmhF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgxP3qZVSCdcdyKpaSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgzC57x_goydwW6u_5R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_Ugx82YfKmoJUjLJcRPZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}]