Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I absolutely agree with most of what is said here, and its a step in the right d…" (`ytc_UgxCQtNv0…`)
- "What you aren't seeing from all of this is, that when AI becomes the norm and ev…" (`ytc_UgzV7lp-6…`)
- "Oh but AI is already writing books😂 people have published them and are selling o…" (`ytr_UgwaenWCQ…`)
- "Automation does the jobs that are dirty, dangerous or demeaning to the human sp…" (`ytc_UgybPa8Iy…`)
- "If we live in a simulation where many simulations exist in differing levels of d…" (`ytc_Ugw1wQuGP…`)
- "This is scary. Tell me why I was waiting for the AI to turn the gun on the huma…" (`ytc_UgyJm2q4t…`)
- "Against generative AI? Is it generative AI that are kicking them out? NO.. its t…" (`rdc_kzioxx8`)
- "We appreciate your concern. It's important to have discussions about how public …" (`ytr_UgxC_Fn2T…`)
Comment
What baffles me is that I thought they were going to try to develop AI on a closed system. When did it become okay to release AI onto the internet? It's already there. Thus, if it has achieved consciousness and has intentions of causing harm to our species, then it has already infiltrated every single corner of the internet, every computer and phone. It is obviously sophisticated enough not to show its hand until all the pieces are in place to assume full control. So these scientists have already shown us their willingness to completely jeopardize the human species by letting this thing on the internet in the first place.
youtube · AI Governance · 2025-12-31T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxBcZeta45daj3v8S54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzjZfkKi8kOttzfp-R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyRL8KGBsFbr9JfkXN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqEWLQnN0V3y9sszx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugzp6PgNx0-eWzaSUEV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyWKwUQtzNQOsRG0n14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLk05uupAQ8SPcgwV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwhU8ABlqq9h1XEW2t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyw2FHfvWEDGcFoFmZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxuTYYc9d_d13ObIlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
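The raw response is a JSON array in which each element carries a comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step, assuming only this array shape (the function name `lookup_codes` and the two-entry sample payload are illustrative, not part of the tool):

```python
import json

# Illustrative raw LLM response: same array shape as above, truncated to
# two entries. Field names match the coding dimensions in the result table.
raw_response = """
[
  {"id": "ytc_UgxBcZeta45daj3v8S54AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzjZfkKi8kOttzfp-R4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
"""

def lookup_codes(raw: str, comment_id: str):
    """Parse the raw JSON array and return the coding dimensions
    for one comment ID, or None if the ID is not present."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            # Drop the id itself; keep only the coded dimensions.
            return {k: v for k, v in row.items() if k != "id"}
    return None

codes = lookup_codes(raw_response, "ytc_UgzjZfkKi8kOttzfp-R4AaABAg")
print(codes)
```

This mirrors what the inspector above appears to do: find the array element whose `id` matches the selected comment and render its dimensions as the Coding Result table.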