Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I love you, Bernie. But unfortunately, the people in power will never give you w…" (`ytc_Ugy5Uw_-Y…`)
- "Bro how can you even call yourself an “ai artist” you did nothing but enter a pr…" (`ytc_Ugwne1LaG…`)
- "why does he say "i'm not an expert on any of this stuff" (12.20 min) ? He 's the…" (`ytc_UgwSNadbm…`)
- "From chat GBT: Here’s a fictional speculative article based on your concept. It’…" (`ytc_UgxthWhW9…`)
- "@SaulEmersonAuthor I use standard chat GPT and he has this kind of memory. You …" (`ytr_UgzPx5g68…`)
- "If this happens to me ever. They better be prepared to pay the consequences. Id …" (`ytc_Ugzpb18-7…`)
- "Ralph Nader interviewing Shoshana Zuboff about her book "The Age of Surveillance…" (`ytc_UgybFpDrZ…`)
- "the only thing majorly wrong with ai art is it uses actual content created by pe…" (`ytc_Ugzzqi4VN…`)
Comment
"And according to many leaders in business, science and technology: we're in the final chapter. The part of our story where we finally go extinct.
And there's nothing we can do to stop it."
I have not watched it yet but it sounds like bullshit... If the now is the now, and there would be infinite possibilities in the now. If something like that would ever happened, it was a choice. It could be stopped in infinite ways, then. plus infinite solutions. Earth could live forever in infinite ways then. Plus A.I could get into infinite directions.. hmm. Even if. Earth and people could be saved in infinite ways. Not being rude... that's just my thought about it right now before having watched it.
youtube · AI Governance · 2023-10-17T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzxbxHJAcifBnjrXFB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwdEblPYRqPehPs89t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyu-abhvc6AruOBnYF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzfAgck95CZ61cUWHx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy9cPhfbEnPWi8-5VR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwjk_yIEloaa7T0Zrl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjtbzBJglKDhZvAI14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxqhMYzeYmoF3cwOOF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxRVEF1Qxa1uE5MyZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRadCRC22lKq_Q_Xt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
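The "look up by comment ID" view above amounts to indexing the model's batch response by the `id` field. A minimal sketch of that lookup, assuming the raw response is a JSON array of records like the one shown (variable names such as `codes_by_id` are illustrative, not part of the tool):

```python
import json

# Raw batch response from the coding model: a JSON array with one
# record per comment (shortened here to two records for illustration).
raw_response = '''[
 {"id": "ytc_UgzxbxHJAcifBnjrXFB4AaABAg", "responsibility": "ai_itself",
  "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
 {"id": "ytc_UgwdEblPYRqPehPs89t4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]'''

# Index the records by comment ID so any coded comment can be
# retrieved directly, as the inspection page does.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

record = codes_by_id["ytc_UgzxbxHJAcifBnjrXFB4AaABAg"]
print(record["emotion"])  # resignation
```

A dict keyed by ID makes each lookup O(1), which matters when one video's comment thread contributes thousands of coded records.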