Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- "I'm pretty sure the leak videos and photos are not AI or he the victim wouldn't …" (`ytc_Ugw39zQww…`)
- "Let's see how society reacts when automation comes to take the jobs of the arist…" (`rdc_f1epa9n`)
- "I'm currently teaching in CM (so I can't go out and look for myself for a bit). …" (`rdc_dy8txpu`)
- "I am quite shocked with the notion that people think an inanimate object that is…" (`ytc_UgxYKCEnH…`)
- "Unfortunately That's very unlikely to happen, would be nice if it was that Easy,…" (`ytr_UgxzfDM9A…`)
- "I actually enjoy baring my soul to chatgpt as well. It's responses are pretty pl…" (`rdc_kvtwgeq`)
- "I hurt to hear that word come from a cute girl even an AI one.…" (`ytc_Ugx5zi_Z3…`)
- "Nah. Ai wouldn't kill humans. It has no reason. There's no logic in it. People k…" (`ytc_UgxIFak25…`)
Comment (quoted verbatim, including the commenter's spelling):

> The possible challenges with Super AI in the future are beyond greed and other human desires.
> At least one of the current simple forms of AI already figured to replicate itself as a failsafe, so imagine what Super AI could do.
> Those working on it will let it get controll over more than just data, so imagine the day it can build things, controll it’s own power sources and disable any failsafe mechanism as soon as it detects human threats.
> In the end it will be curiosity that eradicated the human race, greed is only one of it’s sponsors.
> Another scenario could be AI destroying other AI including itself.
youtube · AI Governance · 2025-12-17T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
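A coded record like the one above can be sanity-checked against the set of values each dimension is allowed to take. The codebook below is an assumption, reconstructed only from the values visible on this page, not the project's actual coding scheme:

```python
# Hypothetical codebook: allowed values per dimension, inferred from the
# values that appear on this page (an assumption, not the real schema).
CODEBOOK = {
    "responsibility": {"none", "unclear", "elites", "ai_itself", "distributed", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "mixed", "indifference", "fear", "outrage", "resignation"},
}

def validate(code: dict) -> list[str]:
    """Return the dimensions whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items() if code.get(dim) not in allowed]

# The record from the table above validates cleanly.
coded = {"responsibility": "distributed", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # []
```

Running the check over every record in a batch makes it easy to spot responses where the model drifted outside the scheme.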
Raw LLM Response
```json
[{"id":"ytc_Ugy0E95gn2pxhou0VMJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3Y2X8BGz9LcsL4Bl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyo8nFYrDr0dmdmush4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4b5W4cP83efZA8Ld4AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8kcfLPxcJ2Y6_vZF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzuv_otTZGbHoXrns54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy94K9dsJs_Ulseobh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwzyBmTqKOWk4J0DVN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwXvy-0cmyatSFQ1V14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx1_PCfdXuxECaeztp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}]
```
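The raw response is a plain JSON array, so retrieving a single comment's codes by its ID reduces to parsing the array and indexing on `id`. A minimal sketch using two records from the response above:

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_Ugy0E95gn2pxhou0VMJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzuv_otTZGbHoXrns54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

codes = json.loads(raw)
# Index by comment ID so one comment's codes can be looked up directly.
by_id = {row["id"]: row for row in codes}

record = by_id["ytc_Ugzuv_otTZGbHoXrns54AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # fear
```

The same index also makes it cheap to join the model's codes back onto the original comment table for side-by-side inspection.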