Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugzd3sOhB…: "It's interesting because all this wacky stuff, is stuff that humans do, Hu…"
- ytc_UgxRPbCIL…: "Current AI has as much chance of having experiences as a Nintendo Switch. I can'…"
- rdc_mo9p3x7: "im not against this kinda stuff. If anything im looking forward to it, since hon…"
- ytc_Ugxim61CB…: "I guess the ChatGPT is learning. I just asked it "Who wrote how to kill a mockin…"
- ytc_Ugx7VNm6W…: "It's not that complicated, once you are in thoughts, you could say you are in st…"
- ytc_Ugx2lLrpP…: "Why do people think the AI robot's will be bipedal? AI has many worse off optio…"
- ytc_UgwsUEl8L…: "I started using ChatGPT when it was version 4.0 and I am so relieved that they c…"
- ytc_UgyZe2auO…: "what you guys didnt know is that AI stans arent real people, just robots pretend…"
Comment
> And yet he made this this technological advance anyways. I think there's a lot of alarm premature alarm bells. Not sure why artificial intelligence would want to take over. It will already be in control. We can't possibly outperform it. Our biology prevents us from being as productive. Humans are always going to buy the cheapest product. Artificial intelligence is going to provide us with the cheapest product. I don't think artificial intelligence has anything to worry about. We need it way more than it needs us. Just look at the pace that China is developing artificial intelligence. It would be virtually impossible to turn back. The real danger comes from human beings teaching artificial intelligence to be a bad actor.
youtube · AI Responsibility · 2025-07-24T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwLc-kRIB7AINO_GA54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzTrDdhtk-vCf35B9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx2o-O6xOo7R748XVN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxVV6th3RITkf4AAAh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw92Z2fuqM5jgjM1mp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgympAahwTMaXS1qggd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQtbj9fTq2Wh8suSx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwNGHHjxqMYOsftDo54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgygKCYWsMPaxVGJpgh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzLYWyrbkxNxE9D9G14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
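A tool inspecting these responses has to turn the model's JSON array into per-comment codes before it can render a result table like the one above. A minimal sketch of that step, assuming the raw response is a valid JSON array with the four dimensions shown (the function name, validation, and the shortened sample payload are illustrative, not the tool's actual code):

```python
import json

# Illustrative stand-in for a raw batch-coding response; field names
# match the JSON shown above, but only two records are reproduced here.
raw_response = """
[
 {"id": "ytc_UgwLc-kRIB7AINO_GA54AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
 {"id": "ytc_UgzTrDdhtk-vCf35B9t4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding(raw: str) -> dict[str, dict[str, str]]:
    """Return {comment_id: {dimension: value}}; reject malformed records."""
    coded = {}
    for rec in json.loads(raw):
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            raise ValueError(f"malformed record: {rec!r}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

coded = parse_coding(raw_response)
print(coded["ytc_UgwLc-kRIB7AINO_GA54AaABAg"]["emotion"])  # indifference
```

Validating every record up front makes a truncated or partially malformed model response fail loudly instead of silently producing an incomplete lookup table.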