Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugwe68KJ9…: "Hydrochloric acid in your stomach is made from... chloride. Bromide interferes w…"
- ytc_UgwN1ut7c…: "If everyone knows the reality that artificial intelligence will destroyed human …"
- ytc_UgxMc3Kvl…: "Hey great video, really made me think! Very tiny nitpick tho, the graph shown a…"
- ytc_UgxiS1Dh9…: "as a person in tech and art, i think ai "art" is stupid, i don't hate ai, i hate…"
- ytc_Ugwq84wV4…: "This is ridiculous. The parents are more to blame than the AI. They didn't succe…"
- ytc_UgypS4K1B…: "This is so relatable. Every time I use chatgpt or any AI; I don't forget to men…"
- ytc_UgwdbEd2J…: "I never understood what makes art, art, until I saw this soulless "art" made by …"
- ytc_Ugy3Tp2c8…: "Artist can use A.I. that studied their own artwork. It's an exciting tool for an…"
Comment
We already have a machine that kills humans, it's called a gun. Like a gun AI needs someone to point it and pull the trigger. Why? As the video points out a human must provide the motive. Machines simply don't care. What we need now is laws, like Canadian and European gun laws, to control people who would use AI as a tool to harm others. By the way, humans think better than AI and without the power drain. Time to start studying psychology.
youtube · AI Governance · 2025-08-17T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxdW7EAaybO9gnPjgV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLeBOml9zNFnBw67B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5aVKtGyMbPmKUMsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5MavApaO_cO_TUZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxjxC6QiPqvW0fWOax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzCp6_qY3JuA50RUDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyP88qrr--n7-G8op4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUHx7tC-JKlTMd0F14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvGR5UI4-QKIg_azd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugxknhuh6-mCeR1lQJB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
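A batch response like the one above is only usable if every record carries a valid value on each coded dimension. The sketch below shows one way to parse and filter such a batch. It is a minimal illustration, not the coding pipeline's actual implementation: the `ALLOWED` value sets are inferred only from the dimension table and the raw response shown here, and the real codebook may define additional values.

```python
import json

# Allowed values per dimension, inferred from the "Coding Result" table and
# the raw LLM response above (assumption: the full codebook may list more).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"approval", "indifference", "outrage", "fear", "resignation"},
}


def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records.

    A record is kept when its id looks like a YouTube comment id
    (``ytc_`` prefix) and every coded dimension holds an allowed value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering this way drops records where the model invented an off-codebook label instead of silently storing them, which keeps downstream counts per dimension clean.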