Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by drawing a random sample; a sketch of offline lookup by ID follows the raw response below.
Comment
And even if the engineers who built the original A.I. cottoned on, how long would scepticism and doubt prevent them from confronting the issue? How long would it take to push it up a corporate C.O.C. to generate an action plan? What is the likelihood that an executive would shut down the issue motivated by self-preservation, in fear of the PR shitstorm such an issue would provoke? What is the likelihood that someone’s testimony would find daylight if these engineers decided to go public with their concerns, or would they become just another Assange/Snowden? Wouldn’t the A.I., presumably controlling the internet, quickly erase any attempt for them to publicise their concern? And what is the likelihood that they’d even muster the courage to do anything in the first place?
I’ve really stretch this point to its limit, but I think you get it.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2022-08-30T21:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
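Each of the four coded dimensions takes a single categorical value. As a minimal sketch of how such a record might be validated offline, the following Python dataclass restricts each field to the label values that actually occur in the raw response below; the `Coding` class, the `LABELS` constant, and the restriction to only these values are illustrative assumptions, and the pipeline's real codebook may define additional categories.

```python
from dataclasses import dataclass

# Label sets observed in the sample raw response below; the actual
# codebook may contain more categories (assumption).
LABELS = {
    "responsibility": {"company", "developer", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference"},
}

@dataclass
class Coding:
    """One coded comment, mirroring the Coding Result table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Reject any value outside the observed label sets.
        for field_name, allowed in LABELS.items():
            value = getattr(self, field_name)
            if value not in allowed:
                raise ValueError(f"unexpected {field_name!r} label: {value!r}")
```

For example, the company/deontological/liability/outrage record from the raw response below constructs and validates cleanly: `Coding(id="ytr_UgwOE2EDtJlCK6gZhEJ4AaABAg.9de46mxTs_e9fN10aqcJF3", responsibility="company", reasoning="deontological", policy="liability", emotion="outrage")`.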
Raw LLM Response
[
{"id":"ytr_Ugz_q1np1vzN50eZr794AaABAg.A8eIqmvIonfA8eW4iNvqPF","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyK8d5gSsekKlBXbul4AaABAg.9e0N_7BVlRD9e7B1Za3qSR","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9doKYX_sgqa","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9doUEezPEGM","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9dp6ZK5p08f","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9dpfa4HYibu","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx3zw0RjVc_KaAM53Z4AaABAg.9deVIQvwmVh9dpi4YaKbRF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwOE2EDtJlCK6gZhEJ4AaABAg.9de46mxTs_e9fN10aqcJF3","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwWnVZkm5UH3tvRJMx4AaABAg.9curzz-EZxT9cyAc-8nBWA","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwWnVZkm5UH3tvRJMx4AaABAg.9curzz-EZxT9cyBkUxSNdT","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
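The raw response is a JSON array with one record per comment in the batch, so lookup by comment ID reduces to parsing the array and indexing it. A minimal sketch, assuming a response like the one above has been saved to a local file; the filename and the `index_codings` function are hypothetical, not part of the pipeline.

```python
import json

def index_codings(path: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records)
    and index the records by their comment ID."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return {record["id"]: record for record in records}

# Hypothetical filename; any saved raw response in the format above works.
codings = index_codings("raw_llm_response.json")
record = codings.get("ytr_UgwOE2EDtJlCK6gZhEJ4AaABAg.9de46mxTs_e9fN10aqcJF3")
if record:
    print(record["responsibility"], record["reasoning"],
          record["policy"], record["emotion"])
    # -> company deontological liability outrage, matching the table above
```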