Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or click one of the random samples below to inspect it.
- "AI job losses is only a problem for the capitalist economic system. Having machi…" (ytc_UgyCTc-_w…)
- "Oneness is beyond any form, including AI! What seems to appear in Oneness is onl…" (ytr_UgzjaOoSG…)
- "At 59:00 he could be doing a better job of explaining Nick Bostrom's argument. H…" (ytc_UgwvBHZVT…)
- "My mom put a TV in my room my whole life and a laptop in my hands at 13 so she w…" (rdc_mvn892f)
- "Using AI to extract vocals or come up with some basic riffs or ideas is still ma…" (ytc_UgyIV3W88…)
- "The saddest thing is that these AI tech bros really think that they're doing art…" (ytc_Ugwql74Ep…)
- "Just imagine cars having been driven automatically caused so much of an accident…" (ytc_UgwC_lq9I…)
- "self-driving cars will be at least a hundred years to become sufficiently safe t…" (ytc_UgwV10yjc…)
Comment
AI technology needs a lot of energy and resources.
Billions are invested in the hope of profiting from the development in the near future.
The huge hype is also an opportunity for tech companies to raise their stock values.
In my opinion, everything invented by humans has failure in its nature; humans are no gods but are always trying to become gods. This is the core problem.
youtube · AI Governance · 2025-12-14T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
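
The record behind this table is straightforward to mirror in code. A minimal sketch in Python; the field names come from the table and the raw response below, and the values listed in the comments are only the codes observed in this sample, not necessarily the full codebook:

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coded comment, as returned in the raw LLM response."""
    id: str              # comment ID, e.g. "ytc_Ugz63w5DFgfpig8Dwoh4AaABAg"
    responsibility: str  # observed: company, user, ai_itself, distributed, none, unclear
    reasoning: str       # observed: consequentialist, deontological, virtue, mixed
    policy: str          # observed: ban, regulate, liability, none, unclear
    emotion: str         # observed: fear, outrage, resignation, indifference, approval
```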
Raw LLM Response
[{"id":"ytc_UgxqDpMqy3XdbvMNR0l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz9Xl3XjbVYiI-H0294AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzRhG6GM45eaxmfu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwzAplp7_XLfoOwjtd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyCB5SWGd2xAPD_OQx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzLg-yThXvGNTaEbpR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy37gD3NTo5kU5fbER4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz63w5DFgfpig8Dwoh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyWWbArK49Sqs4TNvR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxItpRzmxdU1DC3C-R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]