Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples (truncated previews, with comment IDs):

- `rdc_eczfbgm`: > her husband “finds it entertaining to brake hard” in front of the self-dri…
- `ytr_Ugw3o9o0I…`: generating ai images is fine, as long as you don't claim that you made it or you…
- `ytc_UgwcsllyU…`: What's the problem here? Open AI just cancelled his H1B and terminated him. That…
- `ytc_UgzKzIZd7…`: In the Philippines, if an AI invasion comes, we will def survive 100% FREE …
- `ytc_UgxbnHwAo…`: Hilarious when the skin is removed to reveal 😬 and 👀 in a 💀. Suddenly that man's…
- `ytc_UgwVLZi-9…`: Eh women use your secrets against you , ai helps me. To each their own.…
- `ytc_UgyNIHOCU…`: At best, AI generative art is good as an idea generator when you have writer's/a…
- `ytc_Ugze_q5i5…`: This is the dumbest video my idiotic algorithm has ever fed me. Clankers are not…
Comment

> I think eventually we are going to have to put restrictions on our “thinking machines” (if you get the reference then you are a chad).
>
> I mean ai is really cool, and it can do amazing things, but is it part of the human story? Well that’s something we’re going have to decide. Do we want our legacies being automated, our thoughts autogenerated, or our jobs being taken? Is the reward worth the entire transformation we will have to take our society through to get to it?
>
> Personally I don’t think we should let ai grow exponentially, it should be limited in its ability, and it should be there to help not to completely take over for our thinking.
>
> I believe if we let ai take over our thinking then we won’t correct the issues in our societies, instead we’ll embrace them.
>
> Thank you for coming to my ted talk.

Source: youtube · AI Responsibility · 2023-09-18T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxv1Ft65Rge2P_jKFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxz5S52dWT58v2uCMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyeblPSmQAhFYjOXj14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugza_7ZkLxkjERxhF5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxxOyWmuo5rg56r7Ul4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwihX74sI8dN-pF7fx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyemBzF0acSXLJGfwx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxW3cJr796ECWBCabF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwoYAr0hecqx8BB36J4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzUgaJn9lFf55PFB8J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]
```
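
The per-comment view above is just this raw batch response indexed by comment ID. A minimal Python sketch of that lookup step, assuming the field names shown in the JSON above; the `parse_codings` helper and its required-field check are illustrative, not the tool's actual implementation:

```python
import json

# Abbreviated raw model output in the format shown above (two records only).
raw_response = """
[
  {"id": "ytc_UgzUgaJn9lFf55PFB8J4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxv1Ft65Rge2P_jKFR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# Field names taken from the response above; a real pipeline might also
# validate the allowed values per dimension.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text: str) -> dict[str, dict]:
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(text)
    codings = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
        # Store every dimension except the ID itself.
        codings[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_UgzUgaJn9lFf55PFB8J4AaABAg"]["policy"])  # regulate
```

The explicit field check matters because batch responses from a model can drop or rename keys; failing loudly on a malformed record is safer than silently coding it as missing.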