Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):
- "It's not going to work for the super rich either, bc when the poor can't buy the…" (ytc_UgzaCkM6r…)
- "i saw somewhere saying that ai can't be copyrighted because it's not created by …" (ytc_UgwMbwKCj…)
- "Would you be able to expand a little on the work you do without doxing yourself?…" (rdc_m6xxalk)
- "26 minutes just to say that, yes, AI is going to take every job, and that the on…" (ytc_UgzZvflWZ…)
- "Does ANYONE want our AI from Musk??? NOT ME. He has NO ethics or morals.…" (ytc_UgyWmxzuH…)
- "I like the technology, but the way its being used is messed up. Specially the wa…" (ytc_Ugxkcjn3q…)
- "The argument is always \"but China\". Honestly, a Chinese AI sounds safer than an …" (ytc_UgxapkZFc…)
- "This is a satanic agenda, designed to steal, kill and destroy. You would be a fo…" (ytc_Ugy74opW5…)
Comment
They are glossing over all the economic problems. They weren't quitting over the AI getting too powerful; they were quitting over bad business decisions, no clear way to monetize their success, capital burn rates, and other business factors that would drive the company into bankruptcy by the end of this year. LLMs are showing diminishing returns and will not reach AGI without a new breakthrough. It's not an engineering problem, it's still a scientific one, and there is no guarantee of progress on it. It may be 10-100 years before we see progress again.
youtube
AI Governance
2026-03-17T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFBh7sICuefzof2kN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwxTBPE9D4iYHNQevF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwzGmtADaJEY6k8qmd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOFfzGwQ6MfDj6MdJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwFprXAsM6XYxs3UQl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx9UBQQH62TXoM3hoV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuvbCVMFKVZiHmmA14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwbFeUalc0LjZuDAhF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwNWIG2RB08sYWiEpt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyIP835IEjrJ4_2SXB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
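The coding result and raw response above follow a fixed four-dimension schema (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and validating such a response, assuming label sets inferred only from the values visible on this page (the real codebook may include labels not shown here):

```python
import json

# Hypothetical allowed label sets, inferred from the values visible above.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every dimension label."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim} value {row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
rows = validate_codings(raw)
print(len(rows))  # → 1
```

Validating before writing to the database catches the common failure mode where the model invents a label outside the codebook, so a bad batch fails loudly instead of polluting the coded results.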