Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Hey hey small Programmer here who works mostly at the backend. Nightshade has be…" (ytc_Ugy9UrSnr…)
- "6:06 was there a particular point in giving a guy who`s income depends on people…" (ytc_UgyhhmT4B…)
- "People are vastly ignorant about the progress of AI in the past 2 years and bein…" (ytc_UgwchFu2L…)
- "I didn't expect this to actually be useful. This is not a mindblowingly differen…" (ytc_UgzfR8n8X…)
- "So if AI is learning and evolving, you seriously don't think that it would figur…" (ytc_UgzGUYoW9…)
- "Short term gain will turn into long term pain, and I'm referring to the mega cor…" (ytc_Ugz0IfV1e…)
- "Bernie, more than a warning, what I need from you is your vision of the future. …" (ytc_UgyAaIo9H…)
- "I suppose my cynical take is that of "if we don't invent ai that will kill us al…" (ytc_UgzpMxU2J…)
Comment
Neil deGrasse Tyson just proved he doesn't understand A.G.I.
A.G.I. is when a computer doesn't replace just part of what a human can do, but replaces humans completely.
There will not be something you can think of to do that A.G.I. doesn't already do perfectly.
Humans themselves are not capable of original thought. So saying A.G.I. cannot do anything that hasn't been done by a human, this is wrong. Originality is a result of random mutation. A.G.I. is capable of inducing random mutation better than humans can.
A.i. is just the rules of reality (physics) given selfsustaining structure. The fact that reality allows for this is in itself proof it is inevitable, and most likely has already happened in the past.
Source: youtube · Video: AI Moral Status · Posted: 2025-07-24T16:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwn1FyEI7IrTAbYGA14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxzZ3gwtw_Po1WDKxh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx4sslyG8q4ROJ3kyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-3e2if2ZvgmhajdB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQz9rot6gK2GqnaNB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy8byu1XmUUxz3hdPh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwSEG7B-Q5zuvbgX6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugywa20mxvkwTIh24k14AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxc4B6bCl5g9HaoPgl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhIDif3NcBXj4EHR54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
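The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed and sanity-checked — the allowed values below are inferred only from the codes visible on this page, not from the full codebook, so treat the schema as an assumption:

```python
import json

# Allowed values per dimension, inferred from the codes observed on this
# page (the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "resignation", "fear", "approval"},
}

def parse_llm_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response; keep only rows whose values fit SCHEMA."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs on this page all start with the "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical two-row batch: one well-formed row, one with an
# out-of-schema "responsibility" value that should be dropped.
raw = '''[
  {"id":"ytc_Ugwn1FyEI7IrTAbYGA14AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_example_bad_row","responsibility":"martians",
   "reasoning":"mixed","policy":"none","emotion":"fear"}
]'''
coded = parse_llm_batch(raw)
print(len(coded))  # → 1  (the out-of-schema row is filtered out)
```

Filtering rather than raising keeps one malformed row from discarding the whole batch; rejected rows could instead be logged for re-coding.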