Raw LLM Responses
Inspect the exact model output for any coded comment.
Responses can be looked up by comment ID or browsed as random samples. A few randomly drawn comments (a small sampling sketch follows the table):
| Comment (preview) | Comment ID |
|---|---|
| I love him so much! Gabor Maté is a retired physician who, after 20 years of fam… | ytc_UgxEh2GoX… |
| >Lol imagine complaining about stealing content when A.I is stealing all the … | rdc_jj9tyh2 |
| All AI models were programmed to do anything to please us the best they can, and… | ytc_UgzoJZUF8… |
| Subbed for this, 100% agree. I heard China has a law that requires watermarking … | ytc_UgzIFtbQ8… |
| Incorrect, ai was made through theft of artists works, and should not have been … | ytr_Ugxt-Sltu… |
| Sorry for the inconvenience but the ai teacher is down for maintenance and you w… | ytc_UgzieW-yn… |
| What if I dropped AT&T. Who will AI talk too…they had to pay for that system, so… | ytc_UgxPrCjoX… |
| also even if you leave work,you still have to go social with your coworkers and … | rdc_dv0kvb4 |
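To reproduce this random-sample view outside the dashboard, a minimal sketch in Python might look like the following. The file name `coded_comments.json` and the `"text"` field are hypothetical assumptions about how the coded comments are stored; only the `"id"` values (e.g. `ytc_…`, `rdc_…`) appear in the data above.

```python
import json
import random

# Assumed layout: a JSON list of coded comments, each with an "id"
# (e.g. "ytc_..." or "rdc_...") and a "text" field holding the comment body.
with open("coded_comments.json", encoding="utf-8") as f:
    comments = json.load(f)

# Draw eight comments at random, mirroring the sample table above.
for comment in random.sample(comments, k=8):
    preview = comment["text"][:80]
    print(f"{preview:<80}  {comment['id']}")
```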
Comment
Sorry but I do not agree with you. You are highlighting bad projects but I would say that more projects are successful than not (I work with AI). The human error is a big thing in many projects and LLMs usually yield a smaller margin of error.
Also, a lot depend on the LLM, if you are using the Copilot by Microsoft, you will 100% get hallucinations but if you try Claude 4.5 or Gemini 2.5 Pro, you will not get hallucinations in most of the cases :)
You also have to look at the scaling factor, if an LLM hallucinates 10% today, in 6 months, that number will be down to 5% (just by upgrading model).
youtube · AI Responsibility · 2025-09-30T18:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy7EN3pktaC31TEKsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwLyAwMBP5-FRVSPyZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYMe5bPfq_Lifm0bF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx1AmoIlMvCSMWPIHp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxN3-eJSrpg7yk86QV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyFrtFeqxqb8ibBq-54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyUWkT4BnIRK17QUXt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdcFWm1pMAlth7AuR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyaFuZ7eve58eYkEep4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7PD9-qZiYcxvXuQl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
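A minimal lookup sketch, assuming each raw response is stored as the JSON array shown above (the file name `raw_response.json` is a hypothetical placeholder; the dimension names come from the coding result table):

```python
import json

# The raw LLM response is a JSON array of per-comment codings, as shown above.
with open("raw_response.json", encoding="utf-8") as f:
    codings = json.load(f)

# Index by comment ID so a single coding can be pulled up directly.
by_id = {entry["id"]: entry for entry in codings}

# This entry's values match the coding result table above
# (responsibility: user, reasoning: consequentialist, policy: none, emotion: indifference).
coding = by_id["ytc_Ugx1AmoIlMvCSMWPIHp4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension:>15}: {coding[dimension]}")
```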