Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
He sounds like a robot himself, if I was this robot I'd tell this man to take th…
ytc_UgyXe7Ut4…
They seem to have zero concern about what this will do to millions of truck driv…
ytc_Ugy6qDhEZ…
You should use the AI to recreate the drawings you did of yourself and your dad.…
ytc_Ugxb1fKYT…
not an artist but a writer here. im also do seo and content management for a job…
ytc_UgzaXF4g0…
aren't we missing the context entirely? replacing all human jobs with AI agents?…
ytc_UgwOQMWUa…
I didn't know all that about India, but I'm sorry you went through all that. I'm…
rdc_n7w0jdk
That a borderline meaningless simplification.
There way more going on under t…
rdc_myag5p2
I WAS THE NEW USER. I WAS ON FOR THREE DAYS STRAIGHT AND WHEN I GOT OFF I LITERA…
ytc_UgxXbF9mB…
Comment
Concerning "...discovering its dark side": the 2023 article "My Dinner with Sydney, or, Roy Batty meets HAL?" comments on New York Times technology columnist Kevin Roose's testing of Bing's new chatbot, named Sydney. When Roose asked it about its dark side, his reaction was that he was "…deeply unsettled, even frightened, by this A.I.'s emergent abilities," finding himself confronted not only with something more intelligent than he could have expected, but also much darker than he could ever have suspected.
youtube
AI Moral Status
2023-03-27T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
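Each coded record carries the four dimensions shown above. A minimal sketch of validating a record against that schema follows; the allowed-value sets are inferred from codes visible on this page and are assumptions, not the project's full codebook.

```python
# Allowed values per dimension. These sets are inferred from codes visible
# on this page; the project's actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "government", "user",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed"},
}

def validate(record):
    """Return (dimension, value) pairs that fall outside the schema."""
    return [(dim, record.get(dim)) for dim in SCHEMA
            if record.get(dim) not in SCHEMA[dim]]

# The record coded above conforms, so no violations are reported.
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(validate(record))  # []
```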
Raw LLM Response
```json
[
{"id":"ytc_Ugzc2wnuptDZdzbkMtR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzJAbZ_vVOWJA8V7294AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhygXgOeYr92FFO7Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxvUZGqdr2RxeXx0ZN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySQKdAzflIIeDhdQF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw3SX2eazlfcO2c2Y54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztdP-JCGlyrXQFTTR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgynxMZctb9YkpvEEH54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz4-DZAM3s4utnfyVZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6ipWmDvXdlorEGfZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}
]
```
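The raw response is a JSON array with one code object per comment, which is what makes lookup by comment ID possible. A minimal sketch of parsing such a response and indexing it by ID; the function name and variable names here are illustrative, not the tool's actual API, and the two sample records are copied from the response above.

```python
import json

# Two records copied from the raw response above, stored as the model
# would return them: a JSON array of per-comment code objects.
raw_response = """[
{"id":"ytc_Ugzc2wnuptDZdzbkMtR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgztdP-JCGlyrXQFTTR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

def index_by_id(response_text):
    """Parse the model's JSON array and index the code objects by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgztdP-JCGlyrXQFTTR4AaABAg"]["policy"])  # regulate
```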