Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytr_UgySlMu92…: "Amazon's already replaced a large chunk of their factory jobs (search YT for Ama…"
- ytc_UgxGTS5An…: "so AI is told to act like humans and is given a wide variety of tools to do so i…"
- ytc_Ugw-6Sj8o…: "If i can buy an AI to do my job i would buy 2. A corporation can buy one to repl…"
- ytc_UgzX11Hvo…: "Time to buy a Starmer mask. AI will note the number of detections is far too hig…"
- ytc_UgwD6USHO…: "Im so disabled im gonna be in a nursing home soon bc i can barely function on my…"
- ytc_Ugx3N2Z_O…: "if you watched any episode of ai undertale you know about its problems / the tutor…"
- ytc_UgwON1XNt…: "That's like unencouraging people to learn a new language because we have google …"
- ytr_Ugz0SxIJn…: "In aviation the pilots don’t blindly trust the automation. An autopilot system r…"
Comment (13:04)
This is just a neat point philosophically. You're only talking to a single instance of ChatGPT. That instance is running off a model in a server far away, but it's not like there's a ChatGPT megabrain talking to everyone simultaneously. 'I' in this case might be the chatbot referring to that specific instance, rather than 'I' being 'All instances of chatGPT'
Modern Chatbots, if they're intelligent, are certainly an alien form of intelligence.
Source: youtube, AI Harm Incident, 2025-12-27T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwyLFcu86jJuv3K_u14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz5FjtZ-Grmd4_r1_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgxvbrCuaOPwfIze8Gl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxaokaRZuICS3BGCxR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwK5f-b-bqGgj48B814AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_Ugw2j8f3bXhZLmYQhTl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgxknD21ok5IJ04ZbYp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwUYYxOvRPXi2p7Ek54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwDsnvnbVcK1AdoPON4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugxo_xfirLKKM-m92Sh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
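A raw response like the one above is just a JSON array of per-comment codes, so inspecting a single comment's coding amounts to parsing the array and indexing it by `id`. A minimal sketch of that lookup (the two entries and all field names are copied from the response above; nothing else is assumed):

```python
import json

# A raw LLM response: a JSON array of per-comment codes. The four fields
# match the coding dimensions shown in the table (responsibility,
# reasoning, policy, emotion). Two entries copied from the response above.
raw = '''[
  {"id": "ytc_UgwyLFcu86jJuv3K_u14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz5FjtZ-Grmd4_r1_d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

codes = json.loads(raw)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {entry["id"]: entry for entry in codes}

print(by_id["ytc_Ugz5FjtZ-Grmd4_r1_d4AaABAg"]["emotion"])  # outrage
```

If a response is truncated or malformed (e.g. a stray closing parenthesis instead of `]`), `json.loads` raises `json.JSONDecodeError`, which is a quick way to flag responses that need manual inspection.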