Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Technically if you generate enough of those and mash them together to have them …" (ytc_Ugzm4MYMB…)
- "While the concept of robots taking over the world is a common theme in science f…" (ytr_UgwbarxFr…)
- "2:16 can people stop using google's *AI* overview as a source for things? Sure i…" (ytc_UgzfriYuV…)
- "I sometimes wonder if this man isn't real and everything in this channel is crea…" (ytc_UgwepCPOw…)
- "It's sad, but the unfortunate truth is that A.I will replace your jobs and other…" (ytr_UgxCf18nq…)
- "Saying “AI can never do X” seems entirely shortsighted, time after time this sta…" (ytc_Ugx5FWWui…)
- "So I've been finding AI images, some with obvious defects, running it through ni…" (ytc_UgxqkJVao…)
- "I do believe the government should have the tech for one reason to get ping if a…" (ytc_Ugz0FpIsm…)
Comment
@matthew_berman Yes, but why agree with a pause in your video? We have had DECADES to think about the consequences of computer AI. We don't need to pause now that it is finally moving forward where we can actually live the future we have been waiting for.
Think Matthew, where was the OPEN LETTER TO STOP AI DRIVING??? It was never there. Did AI have some accidents while driving... yes it did. Did 1000 CEOs and Tech People run screaming to stop them from making cars that drive by AI? No... Where was all this "concern" when actual cars were crashing? The reality is they just want to use a pause to catch up so they can do it too.
Platform: youtube · Topic: AI Governance · Posted: 2023-03-30T04:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsY-cBEF5A","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsbW4K7MBw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrn4m0trjv9nruHw0PaSq","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nrle7iFSFV","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ns6RZtoKDO","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nt0NShWS_s","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntJ7Qv2sCu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntlXtisU_-","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugw6EtfFGqbU3EKNXFx4AaABAg.8ebBLFhnP-u9TQaU28JdPc","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxyULC5OslX0G74cJx4AaABAg.8eZkIXf7xt38e_xmX9IADA","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
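A response like the one above can be turned into validated records with a small parser. This is a minimal sketch, not the tool's actual pipeline: the label sets below are only the values observed in this page (the real codebook may define more), and `ytc_example` is a made-up comment ID for illustration.

```python
import json

# Dimension labels observed in the sample response above.
# Assumption: the real codebook may allow additional labels.
OBSERVED_LABELS = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"fear", "indifference", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response into a list of coded records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        # Every record must carry an id plus one value per dimension.
        missing = {"id", *OBSERVED_LABELS} - rec.keys()
        if missing:
            raise ValueError(f"record missing fields: {sorted(missing)}")
        for dim, allowed in OBSERVED_LABELS.items():
            if rec[dim] not in allowed:
                # Flag rather than fail: the model may emit a label
                # outside the set observed here.
                rec[f"{dim}_flag"] = "unrecognized_label"
    return records

# Hypothetical single-record response for demonstration.
sample = ('[{"id":"ytc_example","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
for rec in parse_coding_response(sample):
    print(rec["id"], rec["emotion"])
```

Flagging unrecognized labels instead of raising keeps a single malformed record from discarding an otherwise usable batch; flagged records can then be routed back for re-coding by ID.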