Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "One thing I always thought about is - has anyone looked into how Autistic people…" (ytc_Ugx515sxp…)
- "I'm a lawyer and I've found that ChatGPT has actually been mediocre to bad, but …" (rdc_jhb07ds)
- "Hey Varun, I have been watching you for a long time and also read your book Pyjam…" (ytc_UgxwW3-Ns…)
- "1. What oil would he take? 2. If they already know how he gets, why…" [translated from Spanish] (ytc_UgxxS8sDg…)
- "AI thinks it is conscious, just like we do, but something is missing. The Creat…" (ytc_UgwRA0xqZ…)
- "i had this mostly written and I accidentally pressed the cancel button but here …" (ytc_UgznBNboE…)
- "As a programmer I'm not in the least bit worried about AI. I use it to make myse…" (ytc_UgyKJUlHr…)
- "What if we could guide artificial intelligence to reflect unconditional love? Wh…" (ytc_UgzuBvcEe…)
Comment

> Whats took 20,000 years for humans would most likely take them 10 years. They don't need to eat or sleep, they don't have to worry about financial problems, health problems, or any of the other things that has slowed down or stopped innovation or a genius from continuing to grow. And the biggest implication is they don't have to worry about death. Ai will eventually be a run away train.

Source: youtube · Topic: AI Governance · Posted: 2025-06-28T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzaIMIHReEDMImKywl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxEHMeks5sr2FIgBhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxao9NwHv-iNsnnlr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwGXRiF1_7tZOnexs14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-FSleuiO0i5ih5uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxqXbkevsi6Dtas9-d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwhxvnmg7mXiXfrV0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1SLU2BSm531PgowF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxDp0uqlJbhTHnzp554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyGNZhOxJN4kYmxC3V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
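The raw response above is a JSON array of per-comment coding records, and the tool's "Look up by comment ID" feature implies indexing those records by ID. Below is a minimal sketch of how such a batch might be parsed and validated; the `SCHEMA` value sets are inferred only from the codes visible on this page (the full codebook may include more categories), and `parse_coded_batch` is a hypothetical helper, not the tool's actual implementation.

```python
import json

# Allowed values per coding dimension, inferred from the codes visible
# in the table and raw response above (an assumption, not the full codebook).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "liability", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "resignation"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, skipping records that violate the schema."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            records[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return records

# Two records copied from the raw response above.
raw = '''[
 {"id":"ytc_UgzaIMIHReEDMImKywl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgxDp0uqlJbhTHnzp554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

coded = parse_coded_batch(raw)
print(coded["ytc_UgzaIMIHReEDMImKywl4AaABAg"]["policy"])  # -> ban
```

Keying the index by comment ID makes the lookup shown in the UI a constant-time dictionary access, and the schema check drops malformed records instead of raising mid-batch.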