Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- "We should start to not using the word AI for slop generative AI. I am a bioinfor…" (ytc_UgxE5c0nu…)
- "As a not good English speaker or writer, ChatGPT and Undetectable AI really usef…" (ytc_Ugwhx69fF…)
- "So when mice get in there and chew up wiring then what, water, heat, deteriorati…" (ytc_UgylkXs89…)
- "is it just me or why do i feel like this video is made with ai…" (ytc_Ugzk8diSS…)
- "If i talk to ai like a pirate, will it respond as a pirate but with the relevant…" (ytc_Ugxi5FzNr…)
- "I love to draw, (which im pretty good at, my friends and others i share it with …" (ytc_UgxrDPi1A…)
- "200% correct if AI replaces everyone then who will consume the product developed…" (ytr_Ugxien8GB…)
- "Here's another point, higher intelligence/productivity doesn't always mean highe…" (ytc_UgzC4IBTo…)
Comment (at 6:23)
I have reservations about the autopilot still. I don’t think that the tech is there yet. Anyway, driving auto with a 6 month old kid? No way. Horrible idea. It’s one thing to risk yourself, but a baby? Yes, there’s risk with driving, period. But while humans have our shortcomings, we can still improvise better than the AI. Think about other people running stop signs, etc. Humans are better at reacting.
My car has the adaptive cruise control, which isn’t the same as the Tesla AI, but I’m extremely vigilant when I use it. Anyway, don’t risk AI with a baby.
youtube · AI Harm Incident · 2023-06-08T15:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugwu90-aJBcfwMCOhlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzXZp1le5oB2aTaqcZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxhyXG8ZF9ZrAMriGJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxgXPRMR3nn-GTTB7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyq9U-AKTBvDEVNr2J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxjOHd4N3USuWI4MpJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz_ESyqi3BaEtRe_ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzsDkeIoiSNTbDO-mt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwHVa8izwwlppgKR_B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwiBmdRgmfTezehyfl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
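The raw response above is a JSON array with one object per comment, carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and checked, assuming the value sets are limited to the labels visible on this page (the actual codebook may define more; `validate_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed labels per dimension, inferred only from the examples on this page.
# Assumption: the real codebook may contain additional labels.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every row against the schema."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# One row from the batch above (the comment shown in the Coding Result table).
raw = ('[{"id":"ytc_UgxjOHd4N3USuWI4MpJ4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_codings(raw)
print(coded[0]["emotion"])  # fear
```

Validating each batch this way catches the main failure modes of structured LLM output: malformed JSON, missing keys, and labels outside the codebook.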