Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgyTMbkPP…`: We cannot even regulate social media such that it remains fair and uncensored. …
- `ytr_Ugydqc9gK…`: We appreciate your perspective. The development of AI technology, including in a…
- `ytc_UgzGjPc9F…`: Well, yeah you baited the AI into giving you the worst case it could "imagine". …
- `ytc_UgzV_b1pJ…`: Using AI to take orders is not supposed to simply be hiring an LLM, but using an…
- `ytc_Ugxy8wkG_…`: 9:20 she revealed her programmer. A.i. has been around a lot longer than we a…
- `ytc_UgwbtluIT…`: My heart goes out to the family, but you can’t blame a mirror for showing you yo…
- `ytc_Ugwah9Tno…`: I am not tired to hear this. On the contrary, when will AI replace Modi?…
- `ytc_UgzoPuoz4…`: Question, if you post a video about it, couldn't someone screenshot the art from…
Comment
I'm more afraid of the extremely large amounts of electricity these data centers require than I am of AI itself. Google, Facebook, Microsoft and other tech companies are investing heavily in fossil fuels--including coal, the most polluting of energy sources--to power these massive data centers, going against global scientific consensus regarding the urgent need to move toward sustainable energy sources that are not seriously altering the contents of our atmosphere. Of course companies like Palantir are extremely dangerous and controlled by misguided people whose lust for profit and power knows no bounds. But the more immediate danger in my opinion is environmental. If these companies are allowed to continue on the path they are on, climate change disruption is going to get far worse. We're being warned by science. The irony is that our technologies, which exist because of science, are going to cause increasingly more environmental destruction and chaos.
youtube · AI Governance · 2025-10-19T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy0XbWExcw1UATlrCZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxdd5LhOa4BqgfiVYJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwXhVfzoMiIICL_VrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxn71dq8OryH5hEXGx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwPVQ3YuLhG9kEyktR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxVF3BV6PccJUZLrGh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxCuyjxFjBVQedL6wB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwnSbFOhb6ZTUZkOSF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwt6IhKX3j2vNFEkI54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwlkvU-6T5Cs7xrl5d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
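A batch response like the one above can be parsed and indexed by comment ID for the lookup feature shown at the top of this page. The sketch below is a minimal illustration, not the dashboard's actual code; the allowed dimension values are only those observed in this sample, and the full codebook may define more.

```python
import json

# Dimension values observed in this sample batch; the real codebook
# may allow additional categories (this set is an assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company"},
    "reasoning": {"mixed", "consequentialist", "unclear", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "outrage"},
}

def index_coded_batch(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError on out-of-codebook values (json.loads raises
    JSONDecodeError on malformed output), so a bad batch is caught
    before it reaches the dashboard.
    """
    records = json.loads(raw_response)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Look up a single comment's coding by its ID, as in the search box above.
raw = ('[{"id":"ytc_UgwnSbFOhb6ZTUZkOSF4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
coded = index_coded_batch(raw)
print(coded["ytc_UgwnSbFOhb6ZTUZkOSF4AaABAg"]["policy"])  # regulate
```

Validating against the codebook at parse time means a model that drifts off the allowed labels fails loudly instead of silently polluting the coded dataset.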