Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews with comment IDs):
- "I hear the real issue is that they input the prompts and get a response but they…" (`ytc_Ugx9rA4oX…`)
- "The best way to handle AI is don't use it and push for it's banning. In the long…" (`ytc_UgxAfQGU1…`)
- "Oh hey, I know what echo chambers are and their dangers, but I didn't realise LL…" (`ytc_Ugx9oN9oM…`)
- "Nxgga you don’t have full self driving (supervised) activated .. that’s why you …" (`ytc_UgxfWx-HJ…`)
- "Do you mean like http://en.wikipedia.org/wiki/Distress_radiobeacon? MH370 had 1…" (`rdc_cgflokv`)
- "I kind of disagree with the recurring argument that “it’s just math and token pr…" (`ytc_Ugz99QFA7…`)
- "I think there is a solution to controversal preferences: I'll will elaborate the…" (`ytc_Ugxm8qUp_…`)
- "one positive thing with LLMs and AI is that it could improve literacy. it gives …" (`ytc_UgxvTvNGE…`)
Comment
> Funny how AI voiceovers also have a lisp ... particularly that one near the end of the video.
>
> Also I think the fear is exaggerated. There's no way that whichever company is making an AI does not build in a "death" code, that can shut it off completely, like a simple word. Finally, no AI would be immune to the electromagnetic pulse, while humans would be perfectly unaffected, no robot can survive in the wilderness of the tropical rainforest or super high elevation, but a human just might. Humans can come back from "hardware errors" (brain damage etc.) however rare and unlikely in a given situation, an AI powered robot - CAN NOT. I think we're overstating the dangers of quantum computing in a world that's actually threatened by real-time extinction events due to climate change and pollution.
youtube · AI Harm Incident · 2025-08-31T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx9IJ6W1_aveWbTvVt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx_SflnCmkA9CuhueN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugxtr0PfoamZbfjv0Kt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzgdVsvR09s35jElsl4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxkKcBHELXytPVPU5t4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugya7LvPEJijd-w01Gh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxQcWmToNmZnvz-wXF4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyvzDSxADB0Up5P8Gp4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugw8tEFHmC-HxWOGOjp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyZICCQUwqTmj280HN4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]
```
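The lookup-by-comment-ID view can also be reproduced offline from a raw response like the one above. A minimal sketch, assuming the raw output is a JSON array of coding objects each keyed by `id` (the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects)
    into an id -> coding-record map for fast lookup."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Two records copied verbatim from the raw response above.
raw = """[
  {"id": "ytc_Ugx_SflnCmkA9CuhueN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugw8tEFHmC-HxWOGOjp4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

coded = index_by_id(raw)
print(coded["ytc_Ugx_SflnCmkA9CuhueN4AaABAg"]["policy"])  # industry_self
```

The same map can feed the "Coding Result" table: each record's `responsibility`, `reasoning`, `policy`, and `emotion` fields correspond to the table's dimension rows.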