Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "I like how dumb the elites think we are as society the CIA has been using AI sin…" (ytc_UgzBOaEEg…)
- "AI is a tool, and it should be treated as such. At university, my professors hav…" (ytc_UgzNfAmci…)
- "Sam Altman: AI poses human extinction risk, we need regulation *EU regulates AI,…" (ytc_UgyszuAEi…)
- "Yes, And the employer is still getting what they wanted. Who the fuck cares if i…" (rdc_hkfu8il)
- "It already happened few days ago. Open Ai announced a new image generator in ch…" (ytr_Ugw1E8IMN…)
- "The irony of the people who created AI losing their jobs to it. Chefs kiss…" (ytc_UgyTGqHw7…)
- "anyone wondering what AI could possibly be capable of should look into Metal Gea…" (ytc_Ugwg6VgFZ…)
- "Algorithmically guessing the placement of pixels according to stolen data isn't …" (ytr_UgxyUZUaj…)
Comment

> 14:20 Forget AI. We don’t even have a way to make humans do things in their own best interests. For example, each of us knows that putting 37 billion tonnes of CO2 a year into the atmosphere is a really bad idea but no one has worked out how to stop everyone else doing it. Even if we discover AI is an equally bad idea we won’t be able to stop that either, not if someone’s making money out of it. A really clever AI would send us to our bedroom and hide our toys.

youtube | AI Moral Status | 2023-08-23T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw3FW44humfvRytVuh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzwyHiM1HKyXpiRIWt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzot6D__-S8sebUFCl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz9D9HfagQF6dIIoZJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyzV141oKXgnWuMpz14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx1EAMC1bhRvfsoYgF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwE3zyok6zo91zsR8l4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzyfLdJujegryvVLjd4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzTFRADUvNm_1Gm1zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx0z6MXvel6r-7bgiB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
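The lookup-by-comment-ID the page offers can be sketched in a few lines: parse the raw response (a JSON array of per-comment codings, as above) and index it by `id`. This is a minimal illustration, not the tool's actual implementation; the two sample rows below are copied from the response shown, and the variable names are illustrative.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (two rows copied from the response shown above).
raw_response = '''
[
  {"id": "ytc_Ugzot6D__-S8sebUFCl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz9D9HfagQF6dIIoZJ4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
'''

# Index the codings by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugzot6D__-S8sebUFCl4AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> outrage
```

In practice the raw response would be read from wherever the tool stores it; the dictionary index is what makes "look up by comment ID" an O(1) operation.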