Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Comment
Typical socialist BS... what gives this pundit or her regulator allies the special wisdom to determine what is in the 'public interest'? Yes of course, AI brings plenty of disruption, what a hot take! But if activists like this can scare voters into placing control of AI technology in the hands of the bureaucrat/regulator/political class, that's the worst of all possible scenarios. Political capture of AI is a far greater risk than its free development in an open market
Source: youtube
Posted: 2026-04-16T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxECaIY8YffnlMiVi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzu7u08CrhdjXqEePB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugya_vBrgVnIRZd9nfh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyKS-NcG30KZaYQ83x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyQUwkMcPPMBAv9CNF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxexCxEB5GAcVkpQI54AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgynFyXP6w0pdumDKAV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwu3Q1Pvfi7dM_PZw54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJgm25ImbjBaaKd854AaABAg","responsibility":"government","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgyJz3MUOEldWGAv9WZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
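The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how a single comment's coding could be looked up from such a batch (the two-entry payload below is an illustrative abbreviation of the output above; field names and values are copied from it verbatim):

```python
import json

# Abbreviated batch output, copied verbatim from the raw LLM response above.
raw_response = '''
[
  {"id":"ytc_UgxJgm25ImbjBaaKd854AaABAg","responsibility":"government","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgyJz3MUOEldWGAv9WZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
'''

# Index the batch by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxJgm25ImbjBaaKd854AaABAg"]
print(coding["responsibility"])  # government
print(coding["policy"])          # industry_self
```

Indexing by `id` is what makes the "look up by comment ID" view possible: the full-length ID (the truncated previews like `ytc_Ugx0MfejG…` expand to these) is the join key between the displayed comment and its row in the model's batch output.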