Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews; select one to inspect the full coded comment):

- "People saying that's why asks whether you're human or robot but wait! They ask t…" (ytc_UgxsPf3bE…)
- "When it comes to the thing of "what isthe point of substituing everyone by AI",…" (ytc_UgzO26Xfp…)
- "Guess I'll be doing repairs on computers without people operating on them... Be…" (rdc_mxzd2wb)
- "Someone has to decide if the AI written script is good or not. That person is th…" (rdc_jj41eql)
- "at 18:56 understand what he is saying and the ramifications if we rely on the AI…" (ytc_Ugznzm3Yx…)
- "If you like video games, than normal ai is bad too. Ai uses RAM, making the RAM …" (ytr_Ugw7SiNln…)
- "School should use tech to enhance their teaching method not spying your kids. Le…" (ytc_UgxNUzMLR…)
- "Why do I feel like we will be to AI what dogs are to us from this conversation?…" (rdc_kvx4f3r)
Comment
I'm not so pessimistic yet, although as a species, we tend to let our hubris and greed dictate our lives until it is not sustainable. We rarely, if ever, willingly take the middle ground. Throughout our history, we've bounced from one extreme to its counterpart.
Twenty years ago, who would have thought we would be shackled by draconian restrictions on our health and bodies?
Since we are nothing if not predictable, we will let AI plow through our lives and society until it is not sustainable for humans. Then, like the proverbial cockroach, we will scramble to contain it with extreme measures.
And likewise, I would rather not contemplate the casualties.
youtube · AI Governance · 2025-09-05T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxp0ZlWI4djRl2sr-R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwjUXlAe5RmgYbzpOx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzaBTInx4boGfnH6Wl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzGtSV-tdw-CNcLjih4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzt375EKNHPlouDnfd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyFlOgdJjIedEFmZvN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxRo9NTc-B89eD8IYZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzks-SxUuCSXYWYObd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwd2cfv-brAELEYc5t4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyvm246YfAhtnirYCZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
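The raw response above is a JSON array with one record per comment, each carrying the four coded dimensions from the table. A minimal sketch of how such a batch could be parsed and indexed for the "look up by comment ID" view is shown below; the `parse_batch` helper and the shortened example IDs are illustrative, not part of the tool, and the real label sets may include values beyond those visible in this sample.

```python
import json

# Illustrative batch in the same shape as the raw LLM response above
# (IDs shortened here; real IDs are full YouTube/Reddit comment IDs).
raw = """[
 {"id": "ytc_example1", "responsibility": "developer",
  "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
 {"id": "rdc_example2", "responsibility": "distributed",
  "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(text):
    """Parse a batch coding response and index it by comment ID.

    Raises ValueError when a record lacks an ID or a dimension, so
    malformed model output fails loudly instead of silently.
    """
    by_id = {}
    for rec in json.loads(text):
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing id: {rec!r}")
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{comment_id}: missing dimensions {missing}")
        by_id[comment_id] = {d: rec[d] for d in DIMENSIONS}
    return by_id

coded = parse_batch(raw)
print(coded["rdc_example2"]["emotion"])  # -> resignation
```

Indexing by ID up front keeps each lookup a constant-time dictionary access, which matches how the detail view above pairs one comment with its coding result.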