Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- The atheist AI is super confused here. Touches up on evolution, multiverse, etc.… (ytc_UgzkjyjI3…)
- disarming them before/when brexit happens might be best? knowing now they side w… (rdc_enrf6l5)
- 1. This is a remotely controlled automaton mimicking the moves of a boxer that w… (ytc_UgyEL2Bxv…)
- He didn't touch on the existential issues including job displacement, social ine… (ytc_UgxEL3p8T…)
- Well i never had a fear of AI before in my life, thanks for instilling that in m… (ytc_UgxX8bNbR…)
- I work in customer service and they are TRYING to AI away my job, but AI keeps f… (ytc_Ugz8icnRq…)
- @DumbCup Yes I'm sure and you've just doubled down on it. Every job you say "Ai … (ytr_UgwIyDCxF…)
- That 61 billion work days number is wild. We already track AI mentions with AICa… (ytc_Ugy80dy_A…)
Comment
Compared to most videos on this channel, this one is very close to truth.
I've seen it coming for years, but no one took it seriously.
Now it is too late.
Unless a miracle happens, humanity is doomed.
We probably have time till 2030.
At this point it would take a massive revolution all around the world with international cooperation of good-willed people in order to pull it off and save ourselves.
We probably lack a couple hundreds (thousands?) years of evolution as a society.
Too many selfish people exist and this will be the end of us all.
Albeit, there is a slim chance that AI will be a "good god" for humans, but I wouldn't count on it.
Enjoy life, while it lasts.
Platform: youtube · Topic: AI Governance · Posted: 2023-07-10T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxsYdVS1DhXPoKbud14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyjYP4ZZqIezW9tg_F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyTSNKFIzU7S-t1rqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwONmUadVrBFoNiSGt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw_WxR9Q9raRKyh7xF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXHu4aCt1HgyBk4Zd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxz8NhGDz6DAX9F2ql4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6NJYaE5ZEd04UFe14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxUn8UIninO6nwf1Nl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzwoc41Be5Y9OlpWLZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
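The batch response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a response could be parsed and looked up by comment ID (the record shape is copied from the response; the helper name and lookup flow are assumptions, not the tool's actual code):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
 {"id":"ytc_UgxsYdVS1DhXPoKbud14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyXHu4aCt1HgyBk4Zd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# The four coding dimensions used throughout this view.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a batch coding response and index the records by comment ID.

    Raises KeyError if a record is missing its "id" or any dimension,
    which doubles as a cheap schema check on the LLM output.
    """
    records = json.loads(response_text)
    return {rec["id"]: {d: rec[d] for d in DIMENSIONS} for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgyXHu4aCt1HgyBk4Zd4AaABAg"]["policy"])  # → regulate
print(codes["ytc_UgxsYdVS1DhXPoKbud14AaABAg"]["emotion"])  # → fear
```

Because the function re-raises on missing keys, a malformed record from the model fails loudly at indexing time rather than surfacing later as a blank cell in the dimensions table.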