Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Doc: what’s wrong with / Robot: I am very very low on oil / Human: she means fever *… (ytc_Ugz7jsz7J…)
- These layoffs weren't due to AI, they were due to companies overhiring during Co… (ytc_Ugy6-d0BF…)
- Yes, because as Eliezer Yudkowsky and Nate Soares wrote in their book "If anyone… (ytr_UgxrdUgl4…)
- AI "Artists" love art, but hate process of making it. They want a shortcut, they… (ytc_Ugz-27vea…)
- CarrotOnPotato That's a trans-humanist thing. Yeah, making AI versions of peopl… (ytr_UghsimJu0…)
- Well, I understand now why older generations have the thought process of "Well, … (ytc_Ugwm7UPHp…)
- It’s not cute. Don’t call a robot cute. This is the beginning of something bad… (ytc_UgzwQivYW…)
- Lex makes the point that there will be incremental change and we will detect a s… (ytc_Ugwu73_33…)
Comment
AI is growing very fast. In the past, robots were like cockroaches: they could move but not think. As AI improved, it became smarter: mice learned from experience, rats could plan and improve, and rabbits predicted patterns.
Today, in 2025, AI is like dogs or cats: it can understand emotions, talk, and create text, images, music, or code. Apps like ChatGPT-5, Claude 3, Gemini 2, DALL·E 3, MidJourney, and Bard show this. But true self-awareness, like a monkey's, is still decades away. Dogs are confused; they think we are the top dogs, so they obey, but monkeys know they are not human.
youtube · AI Responsibility · 2025-10-15T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxTHDzABxWer7xW_dd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxxowXNjH_1TG8bh6R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwixPNtKRiC7JUJi1x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyLyOgkhm5KuvkRTOd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwM1l5y5Ih6w1Y2Urx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzMIPmlpmjblgF64K94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwlJvbQ0q37WDfWSwl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxSJ42int3-FT-UGGZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxtmeIBf_1TG3iGoVh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwcTGNN4kP02EzdBlx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```