Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by its comment ID, or inspect one of the random samples below.
- "it sucks that by next year ai will have easily adapted to poisoning bringing us …" (`ytc_UgwPRvM48…`)
- "i had to delete my pintrest account as i started seeing my art being put into Ai…" (`ytc_UgzbghOXX…`)
- "BTW, ChatGPT is a generalist, that's why it still is not as knowledgeable about …" (`ytc_UgwMiAO4-…`)
- "There is no way to create super intelligent AI, and maintain capitalism. This is…" (`ytc_Ugylzdhgz…`)
- "Yep, absolutely. At the undergrad level, AI is getting more degrees than people …" (`rdc_maj1qq0`)
- "Hot take: AI has helped me learn a TONNE of stuff much faster than I would have …" (`ytc_Ugx6y18AQ…`)
- "What your were explaining about Echo and using another AI to reach the outer int…" (`ytc_Ugw5kT3-Q…`)
- "I already coded ai rulesets that keep it aligned with human values. And for to…" (`ytc_Ugz8XqvHy…`)
Comment
when the greatest minds in the world can't agree where AI will lead, it quickly exposes the weakness of human intelligence. You merely need to look at the history of mankind, nothing but greed fueled wars over land, resources, and control. Now that the genie is out of the bottle it's only a matter of time until we reach yet another inevitable conclusion. If I were a betting man I would put my money on blackrock. seems mr. fink will soon own AI like he does everything else in the world. I think I just called the next bond film!!!!
Source: youtube · Topic: AI Governance · Posted: 2024-01-04T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyJnXmJnUjI4BZw5cd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwpYg_nreZwNpknQop4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzjYp7Oaf7u7i6xvml4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzm3XB7nXqrSR1ypFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwTPRPTwe106_KghyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyNfYpULfh_Ir9Ix1R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwO_VzN-pF3q4Py3c54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwkgvHDi4UVDVYQ8Ml4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxm3y0DqVd6ftgejit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxD2Yiu9gI7Z8nwxY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
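The lookup-by-comment-ID view above can be sketched as a small parser over the raw batch response. This is a minimal illustration, not the tool's actual implementation: `index_codings` is a hypothetical helper, and the shortened `RAW_RESPONSE` below reuses one entry and the four schema dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) from the raw response shown above.

```python
import json

# One entry copied from the raw batch response above, trimmed for brevity.
RAW_RESPONSE = """[
  {"id": "ytc_UgwpYg_nreZwNpknQop4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "resignation"}
]"""

# The four coding dimensions used by the schema.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    codings = {}
    for entry in json.loads(raw):
        # Fall back to "unclear" if the model omitted a dimension.
        codings[entry["id"]] = {d: entry.get(d, "unclear") for d in DIMENSIONS}
    return codings

lookup = index_codings(RAW_RESPONSE)
print(lookup["ytc_UgwpYg_nreZwNpknQop4AaABAg"]["emotion"])  # resignation
```

Indexing by ID also makes it easy to join a coding back to the original comment record (platform, topic, timestamp) when rendering a detail view like the one above.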