Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_Ugwae3aU_…`: "Before using AI, you need to learn how to think. You have a tech outage (let’s …"
- `ytr_UgwkdL_WM…`: "Hey @angelgunzz, thanks for your interesting comment about AI! It's like having …"
- `ytc_UgyUL4fCZ…`: "Why don’t AI “artists” just compete against each other. That way they’re competi…"
- `ytr_UgzZAlbgD…`: "I think your example lacks any actual grounding in reality. First and foremost, …"
- `ytc_UgzL0K9mJ…`: "What if we already live in god like AI simulation, and within the simulation a n…"
- `ytc_Ugx_SNSGz…`: "Honestly even people who are pro-AI shouldn't think that it should be copyrighte…"
- `ytc_UgzKjtv_U…`: "DALL-E 2 is an AI model developed by OpenAI that generates images from textual d…"
- `ytc_UgyTL4wUE…`: "99% unemployment? This guy is crazy. That cannot happen physically, it is imposs…"
Comment
If you haven't watched M3GAN you should. One simple mistake turns into a mass worldwide agenda. It already slowly has. First it started as our Furbies, our little toys that we played with. Our cellphones, computers, etc... AI will find ways to override simply because whoever creates the perfect prototype will have access to whatever they want. Everything is on the web. They will win and they will take over if we don't slow down. Imagine if any AI gained access to our nuclear technology and used statistics to logically reason wiping out humanity. I don't fear we will go down without a fight, but there will be mistakes and situations that will create hysteria.
youtube
AI Governance
2024-01-17T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyv5Kodbqtd2Za3kqN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxnodQhTiLR1MqMw854AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzx3MHQPUeuD8coHtR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxr_7DtuDmeQA8R6GZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwkacI9_eHoJ5dSkBR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw8ugwX0J4qrBcuLPl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxJWL5SYOt7U4qgVC94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmFXIff20X8KQiTRh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyFHqCxE2UlL1IzQ3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSlLnTtSZsUuR01vN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"}
]
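The raw response above is a JSON array of per-comment coding records, each keyed by comment ID, which is what makes the lookup-by-ID view possible. A minimal Python sketch of parsing and indexing such a response follows; note that the allowed label sets here are hypothetical, inferred only from the values that appear in this batch rather than from any official codebook.

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (abbreviated to two entries from the batch shown above).
raw_response = """
[
  {"id": "ytc_Ugxr_7DtuDmeQA8R6GZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxnodQhTiLR1MqMw854AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
"""

# Hypothetical allowed labels per dimension, inferred from this batch;
# the real codebook may define more values.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse the model output and index records by comment ID,
    dropping any record with an out-of-vocabulary label."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            index[rec["id"]] = rec
    return index

codings = index_codings(raw_response)
print(codings["ytc_Ugxr_7DtuDmeQA8R6GZ4AaABAg"]["policy"])  # → regulate
```

Validating against a fixed vocabulary before indexing catches the most common failure mode of structured LLM output: a record that parses as JSON but contains an invented label.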