Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `rdc_nnt86px`: God, you people... *You are currently using AI.* Reddit uses…
- `ytc_UgwHdEs5b…`: 21:51 looking closely at the fingers on the people in that courtroom sketch to m…
- `ytc_UgyY31PaC…`: People aren't worried Ais going to enslave humanity they are worried ai will tak…
- `ytc_Ugz4pH8q5…`: The extinction probability will never be as low as you'd like because an intelli…
- `ytc_Ugyyd-ZeP…`: It looked A.I. to me. Look at the seperation between the upper lip and front tee…
- `ytc_Ugzf8qUDX…`: I was thinking the job market will be more oriented for manual labor since it is…
- `ytc_UgzHXygmK…`: Erm no it looks like a robot doesn't look like a human at all . Skin doesn't loo…
- `ytc_UgxS2YXqY…`: Why do people even attack AI artists? It doesn’t matter if they drew it or not, …
Comment
I think the biggest problem is that you have read so much of the BS and optimistic crap about AI that you think its actually intelligent. It's not. Language models are just emulators. They don't actually understand anything they literally just generate statistical answers from everything they have read on the internet... and they have ALREADY scrubbed the entire internet...
If they actually understood anything there would be no hallucinations / no kids committing suicide because of AI.
In the future maybe... but what they are doing now isn't even close now to AI and they are just hyping it to the point of insanity.
At best it's a tool to sift through large amounts of data.
| Platform | youtube |
| Timestamp | 2026-04-26T05:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzVkF_NrLhkrEUVVVJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLowtyS4K1gwcVkiR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJFDUpw1XGigpJ7mN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxRXlHNABr7L1lv52p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyeLGPbuS6GBEUl8NR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMBadBsMgyzTfMzXh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxlfB3eUxRSUIynTop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxFI5kcBLGmjQUDv5N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxfzkj5jcopo-Urf754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzY4WLqaKcu4xBIIfJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
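A response like the one above can be checked mechanically before the codings are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the coding-result table; the allowed category values are inferred only from the records visible above, so the real codebook may permit more:

```python
import json

# Allowed values per dimension, inferred from the observed output above
# (assumption: the actual codebook may define additional categories).
DIMENSIONS = {
    "responsibility": {"none", "unclear", "company", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "approval", "fear", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding records, dropping malformed entries."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry a comment ID to join back to the source comment.
        if "id" not in rec:
            continue
        # Keep the record only if every dimension holds a known category value.
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            valid.append(rec)
    return valid

# Example using the last record from the response above.
raw = ('[{"id":"ytc_UgzY4WLqaKcu4xBIIfJ4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(parse_codings(raw)[0]["emotion"])  # outrage
```

A record with an out-of-vocabulary value (say, a hallucinated emotion label) is silently dropped here; a production pipeline would more likely log it for manual re-coding.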