Raw LLM Responses
Inspect the exact model output for any coded comment; look up by comment ID.
Random samples:

- `ytc_UgzkDu55A…` "Haven't we learned anything from James Cameron? Do you think we can really stop …"
- `rdc_l5l4uo3` "Have you watched Pantheon yet? Hopefully AI doesnt go quite that way, but it's s…"
- `rdc_f6xkbda` "Humans can be pretty bad at driving. There *will* be an AI better at it than us …"
- `rdc_nueff5b` "Bc the people running AI companies have bribed the govt to prevent them from hav…"
- `ytc_UgwWOzlHJ…` "Yeah its called algorithmic bias, it works how when your testing on ai, its most…"
- `ytc_Ugx3xFahK…` "And? So ChatGPT spits out your office drama, and it's a good book because it's …"
- `rdc_mlhm9si` "OP: you describe people who think AI as conscious as “outright delusional”. I w…"
- `ytc_UgwUmxxx3…` "This is the goal of the World Economic Forum: They want to establish an automate…"
Comment

> The real problem will come when we make such an AI that had a physical body. That said I don’t see this mechanical monkey doing or understanding the amount of engineering and technical know how to repair or upgrade itself, only that it is not human. Maybe a bit more advanced and it will be as advanced as the average person today, obviously with a greater capacity to learn, but we will have advanced by then as well, and truth be told it might consider us some sort of divine creator as we ourselves have divine creators in our religions

Platform: youtube · Video: AI Responsibility · Posted: 2023-07-11T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
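A coded row like the one above can be sanity-checked against the dimension values that appear in this sample batch. A minimal sketch, assuming the allowed sets are only the values observed on this page (not the project's full codebook), with `validate` as a hypothetical helper:

```python
# Dimension vocabularies observed in this sample batch; illustrative only,
# not an exhaustive codebook.
OBSERVED = {
    "responsibility": {"developer", "none", "ai_itself"},
    "reasoning": {"virtue", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "indifference", "mixed", "fear", "approval"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED.items()
            if row.get(dim) not in allowed]

# The coding result shown above passes cleanly:
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(validate(row))  # []
```

Anything the model emits outside these sets (a typo, a hallucinated label) shows up as a non-empty list and can be flagged for manual recoding.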
Raw LLM Response

```json
[
  {"id": "ytc_UgzA18tE6LrKe8nBxkx4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxPjkkrCLDu8S5-Ekh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzsaSCogCsUHOh4CEV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyCufl9xA8XRPLvyRF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzr3aRs15foRgVrpU54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxbB_mgCGnfn1652Xt4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyLMMzdv0wjq2luTkt4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx1kopT8dhj-0NNSqt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxok5zqhaGFc0lBLhJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzjUSWzaX6yHR2dGi94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
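The "look up by comment ID" step against a raw batch response like this can be sketched as follows. This is a minimal illustration, assuming the response is a JSON array of objects keyed by `id` as shown; `index_by_id` is a hypothetical helper, and the two rows are copied verbatim from the response above:

```python
import json

# Two rows copied from the raw batch response shown on this page.
raw_response = '''[
  {"id": "ytc_UgzA18tE6LrKe8nBxkx4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxPjkkrCLDu8S5-Ekh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index its coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgxPjkkrCLDu8S5-Ekh4AaABAg"]["emotion"])  # indifference
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when inspecting many comments against a large batch.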