Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I acknowledge that David Deutsch is smarter than me, but from the perspective of…
ytc_UgxiwzcZp…
AI Deep fakes news happening
❤Hope there is AI law soon to mandate to all publi…
ytc_Ugx88mr2K…
ChatGPT has such impeccable listening skills😅..Avery insightful video Ruslan tha…
ytc_Ugz5_G82T…
Have the 12,000 nuclear weapons stacked like firewood across this planet suddenl…
ytc_Ugz2p9zVi…
Why does everybody have to work? If AI is doing all the work building, transpor…
ytc_UgwqieHUa…
Super evolved AI ? And "it" was created by ...? Super "entity" ? And this entity…
ytr_UgzkJAJ5P…
@mschribr Let me use my profession as an example.
As a lab technician, my job b…
ytr_UgzfUKK7l…
the left one is AI, i can tell because it botched the LG logo on the bottom left…
rdc_oi2elpm
Comment
This is an interesting theory; however, the model you have presented does have a few holes. The problem with an AI takeover will become consumption: the products a company generates, be they physical, service-based, or virtual, are ultimately designed to be consumed by an end user who tenders currency for them. But if the only people able to consume these products are a fraction of a fraction of the population, then these companies will inevitably consume themselves.
Secondly, there will inevitably be issues with scale. As the demand for AI increases, so will its cost, due to availability dictated by the power grid, the supply of microprocessors, and so on. There will exist a point at which there is no profit in replacing humans with AI.
Not entirely, nor as quickly, but over time it is certainly possible, if not probable.
youtube
Viral AI Reaction
2025-12-06T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy5Y1md8zBTQDShez94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwLHeUXnyzg1t_leAd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugytl0F5Sa8J6Shq6ud4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzreNYqFmJ6GNV8d8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBZtDhqRvHKS8ffuN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQWtFPcBmC5_IpNXR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxPyLv4b_rfAc-mLwl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjF5aIOdQLfpDEHc14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLHdqBDzxjiFaI5HF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-KQ7cTDzDpDrOLQd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
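The raw response above is a JSON array of coded records, one per comment, with one label per dimension. A minimal sketch of how such a response could be parsed and sanity-checked before storage; the allowed label sets here are assumptions inferred from the values visible in this dump, not a confirmed codebook:

```python
import json

# Allowed values per coding dimension. These sets are ASSUMPTIONS
# inferred from labels visible in this dump (e.g. "distributed",
# "consequentialist"), not an authoritative codebook.
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "developer",
                       "company", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "disapproval", "indifference",
                "resignation", "approval"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only records whose
    labels all fall inside the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

A record with an out-of-vocabulary label (a common LLM failure mode) is simply dropped, so downstream counts stay clean; a real pipeline might instead log such records for manual recoding.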