Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "it said the cia fbi and nsa did it and gpt -5 said there are 600million in china…" (ytc_Ugzh-ioC1…)
- "34:45 - imitation, that’s exactly it. We are building something that will imitat…" (ytc_Ugygsx3SC…)
- "Those are real people dressed up as Ai Robots to get publicity for a movie they …" (ytc_UgybzZ_nU…)
- "i bet you anything, they're going to pay for automated/AI tax when that time com…" (ytc_Ugwetrywa…)
- "Anthropic is valued at $183 billion. The $1.5 billion award is nothing to them. …" (ytc_UgwfCB5bX…)
- "this \"real relationship that makes it interesting\" stuff is completely subjectiv…" (ytc_Ugwloh_QY…)
- "There’s a fundamental problem with the idea we'll be sacked for AI - AI doesn't …" (ytc_Ugy3d3QND…)
- "I feel like consciousness is way overhyped. My brain is not that different from …" (ytc_UgxbAnSWx…)
Comment
The addiction to money and hoardings it is the root of the problem. Everyone wants more money more money. Some have so much money they can’t spend it all before they die even if they tried. Explain to me the point? So you can leave it for little Timmy who you’ve taught nothing. Why is someone making 250 times more than their employee. The same employee relying on welfare…an that’s okay?! As a small business owner I’d be embarrassed to know my employee doesn’t make enough to live without help and I have a surplus. Allowing money to come into the AI space is what will ruin it, just like money always does. Name one person that has on their death bed said….man I wish I would have made more money.
youtube · AI Governance · 2025-06-18T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxxBepumWmq66J82lJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzF4NP4gmbzxdBCxRJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwltPBuOgU3QhEu2_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugynn5YXgxQ5bl9YugZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLI3IeVuAullznqUl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwjfOTPimryYJIJgZd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzIajnVzdKD_5rx4wl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqPLQXSwGC-T2JT1V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJeamRsX2B2C5qVH94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzzSj1hCEQERqYictR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
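The raw response above is a JSON array with one object per coded comment, keyed by comment ID. As a minimal sketch of how such a response maps back to a Coding Result table (the ID chosen below is an assumption: it is the entry whose values match the table shown above; the pipeline's actual matching logic is not shown on this page):

```python
import json

# A one-entry excerpt of the raw model output shown above.
raw = ('[{"id":"ytc_UgwjfOTPimryYJIJgZd4AaABAg",'
       '"responsibility":"company","reasoning":"virtue",'
       '"policy":"regulate","emotion":"outrage"}]')

# Index the coded objects by comment ID for lookup.
coded = {entry["id"]: entry for entry in json.loads(raw)}

# Recover the four coding dimensions for one comment.
row = coded["ytc_UgwjfOTPimryYJIJgZd4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {row[dim]}")
```

Each dimension then fills one row of the Coding Result table for that comment.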