Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Yes, good points on wether or not we have a sentient ai on our hands right now, …" (ytr_Ugx2ewQws…)
- "We will have no purpose thanks to a few elite lunatics that want control over pe…" (ytc_Ugzcw38LL…)
- "A reminder to everone to read the Vice article "AI Isn’t Artificial or Intellige…" (ytc_UgxlfGKaS…)
- "Rather surprised you didn't touch on the combination of AI with quantum computer…" (ytc_UgzJrxhdI…)
- "That's a question I've wondered for a while. Without radical change in how an ec…" (ytc_Ugy9P3nzV…)
- "I'd say the problem is not with AI art itself or with it feeding on artist's wor…" (ytc_UgwGlJ1R3…)
- "Control: AI's capacity for surveillance and control is viewed by some as a poten…" (ytc_Ugwt4LdVT…)
- "How do you figure? We have an estimated 40 million slaves in the world right n…" (rdc_neprop3)
Comment

> @michaelreed4078 Asking people if AI will destroy us now is like asking the Wright Brothers how safe it is to fly 2 months after their first flight. Ofc they're gonna say it's dangerous as hell.
>
> People are theorizing as if we somehow invented Superintelligence "tomorrow" how dangerous would that be, but that's not how it happens. Getting there takes tons and tons and tons more iterations of AI, and as we get closer and closer we will be able to refine it and become more and more confident about its capabilities.

Source: youtube · Video: AI Moral Status · Posted: 2025-10-31T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwCYqWq4Qbl8PSoD514AaABAg.AOv8jxc80nxAOvJ4WdT3mU","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz-7kyDNsbzE1_2scN4AaABAg.AOv8h1fhwUnAOvAazTjgAA","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz-7kyDNsbzE1_2scN4AaABAg.AOv8h1fhwUnAOvEVh4mrm5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwZodMs5G-ScGJk6NJ4AaABAg.AOv8B77ai73AOvAx5LxVRi","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvIOFE6C6f","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvImtPifam","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvIyv-UHR0","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugzy2yuIDLIM_CF3lWN4AaABAg.AOv85PiQaTMAOvJGavgURI","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugwz6ZtSJ4_bZauzZ6N4AaABAg.AOv8549-nIGAOvCSg2YlKp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzU0EmuZ55E9T-ENeh4AaABAg.AOv80pWyMdBAOwGvZFHVRI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
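A response in this shape can be machine-checked before the codings are stored. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those seen in the table and response above (the real codebook may define more categories, and the function name `validate_codings` is hypothetical):

```python
import json

# Allowed values per coding dimension — inferred from the examples above,
# not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any out-of-schema coding."""
    entries = json.loads(raw)
    for entry in entries:
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry.get('id')}: bad {dim!r} value {value!r}")
    return entries

# Usage with a single (hypothetical) coded comment:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_codings(raw)
```

Failing fast here keeps malformed or hallucinated category labels out of the database, so the lookup-by-ID view always shows a coding that matches the schema in the table above.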