Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgzW5jSwY…: "No, our biases are real and AI will use our natural biases against us even if yo…"
- ytc_UgxF7X6E9…: "for me AI is good for astronomy, and very annoying job that nobody want to do b…"
- ytc_Ugzms-ENu…: "Neil and company appear drunk once again. My dog (Mr Carl Sagan) is like wtf is…"
- ytc_Ugw1xiAmS…: "Always appreciate these videos. As someone in the tech sector it has been baffli…"
- ytc_UgxWowBjo…: "Ai art is good in some scenarios / Character drawing? Fuck no, fuck that, I'd lik…"
- ytc_Ugy0lkEFT…: "Stephanie, did you try and search for your deepfake? it's crazy out there. take …"
- ytr_UgzQ4GzMd…: "This is most likely what will happen. Assuming that these AI machines don't get …"
- ytc_UgzMAHq6x…: "Chatgpt helped me with legal issues when no human, especially a lawyer and the g…"
Comment
I'd argue there's a fourth major problem. When you give a prompt, it is essentially finding what answer is likely based on what it's seen before in training. It's a lot more complicated and mathy than that, but that's the gist. The problem is that all that training comes from us; if you try to train the AI on its own results, you start to run into problems.
So, how exactly is AI using current models going to replace people? Say AI replaces us in some field. It won't be getting new data, new ideas, new concepts. At best it'll be stagnant.
Current AI is a reflection of ourselves. It can only do what we currently do, and can't advance on its own.
Source: youtube | Posted: 2026-01-23T02:2… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwgiDHcpIe7EOE44Fd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4YV7zvuN8S0qlMv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyYSu47-_ZvBUHwOJp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5pRBBIVYKd-qbv_t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3pY50R0a7Vd7-O4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYTkx7pv7ic0ulhOl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOWE7-D7rNSpckpD94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgvL2b6LE0_0j5mdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9VQ0TzbYBcAFodot4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgySY6I0fV96WWo88tp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
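The raw batch response above is a JSON array of records keyed by comment ID, so the "look up by comment ID" step can be a straightforward parse-and-index. Below is a minimal Python sketch. The `ALLOWED` sets are inferred from the sample output shown above; the actual codebook may define additional categories, so treat them as an assumption.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred
# from the sample response above, not from the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "resignation"},
}


def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if a record carries a dimension value outside the
    ALLOWED sets, which is a cheap way to catch malformed model output.
    """
    indexed = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed
```

Once indexed, looking up a coded comment is just `indexed[comment_id]`, and the validation step means a lookup never returns a record with an off-codebook value.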