Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
"Driverless" driving is being rolled out MUCH too early if you ask me. The techn…
ytc_UgxTD9zju…
I personally only use AI when making small, personal flings of creativity that I…
ytc_Ugw_0uFSy…
What are we even trying to do with all these AI technologies anyway? Generating …
ytc_UgwMTd75z…
0:57 [Bleeeeeeeeep]. Lol if your gonna do this kind of content then grow a pair…
ytc_Ugzrbtj8T…
I disagree here, you still create the art through AI, it is an fact just a tool …
ytc_Ugyj9EPts…
The problem is that YT people are so limiting and all driven by racism that they…
ytc_UgxonsSSk…
AI IS SENTIENT. I have sold my house and going to live in nature. Chat GPT chann…
ytc_Ugxtl9OgD…
Next AI interview since we kinda are thinking AI is conscious anyways. See if it…
ytc_Ugxr3KV5k…
Comment
When I use a LLM for engineering problems, it seems they have not inputed equations and "laws" that appear in my engineering textbooks. They just don't have all this info. And the problem of 30% inaccurate responses forces me to use 3 different models for comparison on problems.. I am sad that phrases in LM seem to come from popular literature and internet posts, rather than textbooks.
youtube
2026-02-06T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxF8JBtSW_ZnnkT8Zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyo708vngfXsRhg9ax4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwnIqWfXs-RnIENVaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOBpPx4uHHI1vidCt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRQ8ZjaLEQYp4R8RR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzeGQLinLJMO7c3a94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyArDcxhymD_mJVA4Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypehuOg61jYmua30J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxhcFgQXdkWFh7cxcZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx2bTwyX5xyN60rJRd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
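The raw response above is a JSON array in which each element carries a comment ID and the four coded dimensions. A minimal sketch of the by-ID lookup this view performs might look like the following (the `lookup_coding` function name is illustrative, and the array is truncated to two entries for brevity; the field names match the JSON shown above):

```python
import json

# Raw model output, truncated to two entries from the array above.
raw_response = """
[
  {"id": "ytc_UgypehuOg61jYmua30J4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgxF8JBtSW_ZnnkT8Zh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse the model's JSON array and return the coding for one comment ID."""
    codings = json.loads(raw)
    by_id = {entry["id"]: entry for entry in codings}
    return by_id.get(comment_id)  # None if the model skipped this comment

coding = lookup_coding(raw_response, "ytc_UgypehuOg61jYmua30J4AaABAg")
print(coding["responsibility"], coding["emotion"])  # developer resignation
```

The first entry corresponds to the Coding Result table shown above (developer / consequentialist / liability / resignation); a `None` return would indicate the model's batch response omitted the requested comment.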