Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm glad they're being very cautious about it. We don't need AI, in my opinion i…" (ytc_UgxKhmLlD…)
- "AI can't replace art. But only a few pictures are created for the purpose of bei…" (ytc_UgwPk91gk…)
- "lol all the fear mongering. AI won’t take people’s jobs cause AI is impossible. …" (ytc_UgwhNDpi-…)
- "As an amateur artist and animator, who is working on making it my career, someth…" (ytc_UgyI5LTNP…)
- "I literally dont see anything wrong with what he did except he didn't tell him 💀…" (ytc_Ugy0NtH1K…)
- "As quiet as it’s kept, AI is not making any money. It’s just costing us our envi…" (ytc_UgzDHlLwg…)
- "@Arkansym except they're not copying the work of the AI, they're taking somethin…" (ytr_UgxEet2H9…)
- "C'mon. . .AI "learns" like humans do as they grow up. The "problem with AI is g…" (ytc_Ugwl4ISxP…)
Comment
Hinton is saying something inherently contradictory: AI will make health claims processing and medical care 5x more productive, yet health providers will keep their jobs, yet claims processors will be needed 5x less and be eliminated. If you follow the implication of his productivity argument, there will be 5x more claims to process, resulting in no net job loss in claims processors! So effect will be neutral … however, people receiving 5x more healthcare will be healthier, capable of more consumption and economic productivity.
youtube · Cross-Cultural · 2025-10-03T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZl11bkSUvRHEBV594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFaK-sCTx6N-EMkO14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2l7-5GPGF87OsPCZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyZaybg8CjeR1AaSEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxYbXh44p63k11dLnt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwb1I8bksAHXN251ih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwd-sXJHhHTdHiwHRN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwhkO8iJnqOaiRVkmJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxTek9oh9wZ7pNc-sR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw_VU9j6kJk1MFeJd14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
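The look-up-by-comment-ID step above can be sketched in a few lines: parse the raw LLM response (a JSON array of coded comments) into a dictionary keyed by comment ID, validating each record against the four coding dimensions. This is a minimal sketch, not the project's actual code; the allowed value sets in `ALLOWED` are inferred only from the examples on this page and may be incomplete.

```python
import json

# Allowed values per dimension, inferred from the coded results shown above.
# These code books are an assumption, not the project's definitive schema.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into a lookup table keyed by comment ID,
    rejecting any record whose value falls outside the code book."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the raw response above, used as a worked example.
raw = ('[{"id":"ytc_UgzZl11bkSUvRHEBV594AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgzZl11bkSUvRHEBV594AaABAg"]["emotion"])  # indifference
```

Keying by `id` makes the "inspect the exact model output for any coded comment" lookup a constant-time dictionary access, and the validation step surfaces any off-schema value the model emitted before it reaches the coding-result table.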