Raw LLM Responses
Inspect the exact model output for any coded comment; look it up by its comment ID.
Comment
I agree that, in the end (which will be soon for the big AI companies), there will likely be just 1 or 2 major "AI" companies left. My money is on the OG: Google (as much as I hate to say it). Use cases like Alpha Fold (Nobel prize winning project) will probably outlast chatbots. The latter of which may get relegated to nostalgic novelties by the 2030s. Just my 2 cents.
youtube
AI Responsibility
2025-09-30T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwa2Kmr-yDoZ4RZwpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEVvTlbZDP31vEjiJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdxTJcMlE44fYFh014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLMvWr6EkdXfh55lZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwKLIoTIZupR-digid4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx99WGi7UEpXHaeDwF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzBnJ2XE4geSVJqZtF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyTFNb0SUHmJMG60nN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoDM_JGyYAuiK3KWx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHUJT_8ejOURstH2R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
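A raw response in this shape can be indexed by comment ID to recover any single coding. The following is a minimal sketch, assuming the batch response is a JSON array of objects with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields shown above; the sample ID and values are taken from the last entry of the response.

```python
import json

# Raw batch response as returned by the model (one entry shown for brevity).
raw_response = """
[
  {"id": "ytc_UgwHUJT_8ejOURstH2R4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# Look up the coding for a specific comment.
record = codings["ytc_UgwHUJT_8ejOURstH2R4AaABAg"]
print(record["emotion"])  # -> outrage
```

In practice the model output may need validation before indexing (e.g. checking that every entry carries all four coding dimensions), since a malformed batch would otherwise fail silently on lookup.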