Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "No No it's not theft" Literally creates an unlicensed copy of an image and chuc… (ytc_UgwJcWsR-…)
- Watching robot: Nice. Watching robot with a gun: Mmm.. Im not sure. Watching rob… (ytc_UgzjzCo1S…)
- "This AI 'artist' is getting absolutely destroyed by the internet right now." Go… (ytc_Ugz8MdBmJ…)
- Sorry, but AI won’t clean a sewer line or take trees down or even plant trees. T… (ytc_UgxhhLhya…)
- It's really weird seeing a toy from the early 2000s being treated as if they're … (ytc_UgzIJh_oF…)
- why must robots look realistic? they should should concenrate on building them: … (ytc_UgzGOsPxd…)
- So now AI will be able to pass the security questions, "click the tiles of a bus… (ytc_UgwO1W9xw…)
- If AI is getting ahead of us then presumably AI will invent a language that we d… (ytc_UgyI2T7eX…)
Comment
I don't actually worry about true AGI. A being that is exponentially more intelligent than a human would be peaceful.
The worry would be something that falls just short of true AGI. 10 times smarter than a human, but only when it comes to making paperclips.
I'm hopeful it's not possible, like the first superhuman AI will probably be optimized for training AI, and it will immediately make true AGI as it's first action.
Guess we'll see!
Source: youtube | Video: AI Moral Status | Posted: 2025-12-31T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzEsVP21oc-Ojg4dyt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzYRoa7_MkTAteWoDx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwvkRx-YifE-9AlUX14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz-wjt_YlEqPDKv_MF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzJaZumQ05Kxh8qzcV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgydiX0uQWQfuct__il4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwGQez1Jq1r6JRHyat4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwtqJdABfpvaNggSwV4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzaQUYEJsaVOXIloFh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxInO7aIaWh0b66EUF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
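A batch response in this shape can be parsed and indexed by comment ID to recover any single coding, as in this minimal Python sketch (the `ytc_example` ID and its field values are illustrative, not taken from the real data):

```python
import json

# Minimal sketch: parse a model batch response (a JSON array of coding
# records) and index the records by comment ID for direct lookup.
# The record below is illustrative only; real responses carry full
# ytc_ comment IDs like those shown above.
raw_response = (
    '[{"id": "ytc_example", "responsibility": "none", '
    '"reasoning": "consequentialist", "policy": "none", '
    '"emotion": "resignation"}]'
)

codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded dimension for a given comment ID.
print(codings["ytc_example"]["emotion"])  # prints: resignation
```

Indexing by `id` rather than scanning the list keeps lookup O(1) per comment, which matters when joining thousands of codings back onto the source comments.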