Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The amazing overhyped robots are not ready for working and taking humans jobs. How many robots are humans in costume. More than you think. Pay close attention to them they give themselves away. It's definitely a high-end computer answering your questions but it's abilities are all being overhyped. A sales gimmick to a degree smoke and mirrors. Some companies that purchased a few retired them and put actual humans back in their place. Amazon being one. Don't fall the AI revolution there is a lot of hype involved.
youtube · AI Responsibility · 2025-11-01T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxgfOKS46CmPI5z06J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy89sKCfpInLZuDaNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxkerL5K5ZX8mc2Y0d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyOH9gjxqB8N1GFXVB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7OVTwmYMXmwEFqD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgytMnvOlcNVPtXFzvR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyW2suBp23D5QhrW694AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyI9rE9nDSGuyGY7fF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxilt6Ioq70I_ByjfF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCmtagD-QyNv7byb14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
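The raw response is a JSON array with one record per coded comment, keyed by comment ID. A minimal sketch of how such a response could be parsed, validated, and looked up by ID — the allowed label vocabularies below are inferred only from the values visible in this response, so the real codebook may include additional labels:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """
[
  {"id":"ytc_UgxgfOKS46CmPI5z06J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyCmtagD-QyNv7byb14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
"""

# Label vocabularies inferred from the response shown here (an assumption,
# not the project's actual codebook).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "unclear"},
}

def validate(records):
    """Split records into (valid, errors); errors pair an ID with its bad dimensions."""
    valid, errors = [], []
    for rec in records:
        bad = [dim for dim, vocab in ALLOWED.items() if rec.get(dim) not in vocab]
        (errors if bad else valid).append((rec.get("id"), bad) if bad else rec)
    return valid, errors

def lookup(records, comment_id):
    """Return the coded record for one comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)
valid, errors = validate(records)
print(len(valid), len(errors))  # 2 0
print(lookup(records, "ytc_UgyCmtagD-QyNv7byb14AaABAg")["emotion"])  # resignation
```

Validating against a fixed vocabulary catches the common failure mode of LLM coders drifting off-codebook; off-vocabulary records can then be re-queued rather than silently stored.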