Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Why did this problem not occur to the scientist as he was working on AI ? A bit …" — ytc_UgzBbCXdJ…
- "Cortana is an AI that behaves like a human intelligence, in which it can learn w…" — ytc_Ugynl1b_L…
- "The only possible argument for AI as an art tool is if the artist only uses AI t…" — ytc_UgyoTgnzs…
- "The urinal and the banana are bad examples. Probably Bansky, Roy Lichtenstein, o…" — ytc_UgwHz2-pb…
- "And that's just ONE industry. In 5 to 10 years we will all lose our jobs to Ai a…" — ytc_UgxgiSVdh…
- "I want to design my own AI / So I wrote a scraper and I have 1'200'000 pictures fo…" — ytc_Ugxw5VTHc…
- "Thank you for your comment! It sounds like you're looking for more control over …" — ytr_UgwwAgZrR…
- "To be fair, there's nothing weird or wrong about studying nutrition. Dude was ju…" — ytc_Ugz8QY02X…
Comment

> Just like Frankenstein’s monster, AI is something humans created that can sometimes get out of control and have unintended consequences. While AI has amazing potential to help us, we also need to be careful and responsible with how we develop and use it.

youtube · AI Governance · 2025-06-26T12:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgywG31gtbUY98Bea994AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8fZ44_nGsxml-3dh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyHkclduB7OQwPGML14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw4FHA5-nNc7eE1zsp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxGrWXGyu1b6iW2df54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxp3Y0oZU1MaZpj4mZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxZ6WJsVALTIBT7lm54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx3nVmo0ENsbUbKazZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXduJtE9gkYs7BUlF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx0zof2Tb4joW9WdM94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}
]
```
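The raw response above is a JSON array with one object per coded comment, keyed by comment ID across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, using two records excerpted from the response (variable names here are hypothetical, not from the tool itself):

```python
import json

# Raw model output: a JSON array of coded comments (two records
# excerpted from the response above).
raw_response = """
[
  {"id": "ytc_UgyHkclduB7OQwPGML14AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw4FHA5-nNc7eE1zsp4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Parse the array, then index records by comment ID for direct lookup.
codes = json.loads(raw_response)
by_id = {record["id"]: record for record in codes}

record = by_id["ytc_UgyHkclduB7OQwPGML14AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # fear
```

Indexing by ID once up front is what makes "inspect the exact model output for any coded comment" cheap: each lookup is a dictionary access rather than a scan of the array.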