Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I dont want my future doctor to be smn who got answers from ChatGPT. Mind yall i…" (ytc_UgyUKIfK5…)
- "Some people are worried about A.I thinking for itself because the consequences c…" (ytc_UgxEl97Ur…)
- "Artificial intelligence is a dangerous weapon if created under world military in…" (ytc_UgwKUBBcG…)
- "Robots, AI, self-driving cars, digital currency, ETC. ARE WE GOING TO REBEL YET …" (ytc_UgytGL-W8…)
- "The thing they always seem to forget is that art is basically the human experien…" (ytc_Ugxs-ASO3…)
- "I'm so glad you are taking this stance. As an indi author and editor, generative…" (ytc_Ugyxlc1qh…)
- "It’s hard to say if AI doesn’t actually lead to greater efficiency when all the …" (ytc_UgzSBfIPu…)
- "I dont really mind self driving cars, but I really hate the idea of remote contr…" (ytc_Ugx4mLw6P…)
Comment

> Ridiculous to worry that AI will wipe out humanity - not too belittle real risks - when the technology doesn't really work and produces slop, while climate change is a real existential threat, and the huge energy demand for data centres contributes to emissions. AI is a huge distraction from the real threat in that sense.

Source: youtube · Video: AI Jobs · Posted: 2026-02-17T20:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxrE-_UgguzgDiaZRZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzYNXmT-wkkrQORXzR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzCzZy3BLsKO3N63WN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3rD3_r6qV7tmnQi54AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwGMt0F7MnWYMyll0F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzErSmKgBriJn-3IZB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzBhfYpFzvZy4SDvmx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyMMVYzC8outb1z_Zt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx_PVHC8a6vEvd_WKJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJlgw-kg37nTIFXzJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
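The raw response is a JSON array with one object per coded comment, keyed by the comment ID and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a lookup by comment ID might work, assuming this exact schema (the variable names are illustrative, not from the tool itself):

```python
import json

# The model's raw output: a JSON array of per-comment codes.
# Two entries from the response above, shown here for illustration.
raw_response = """
[
  {"id": "ytc_UgxrE-_UgguzgDiaZRZ4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzCzZy3BLsKO3N63WN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID so a single comment can be inspected
# directly, as the "Look up by comment ID" view does.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgzCzZy3BLsKO3N63WN4AaABAg"]
print(code["reasoning"], code["emotion"])  # consequentialist indifference
```

Indexing by ID keeps the lookup O(1) per comment and makes it easy to join the model's codes back onto the original comment records.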