Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgwL7_xXr… · "Right now, we’re entering a period where AI has partial autonomy—it can perform …"
- ytc_UgxQBnt4X… · "For the past few years they have constantly stated that driverless trucks are no…"
- ytc_UgzRrDASm… · "There is also a concern about the creativity in the art work. Imagine if people …"
- rdc_lp6q7ll · "Build your own energy infra then, and make sure it doesn't use any water or belc…"
- ytc_UgzZq-ako… · "Legislatively, "...human judgment..." is the referenced authority. The Bible sa…"
- ytr_UgzgOSQj5… · "Yes but in the end they still have SOME plan. Abstract art like that is still so…"
- ytc_Ugwtp0aTB… · "First Law: A robot may not injure a human being or, through inaction, allow a hu…"
- ytr_Ugx-9ec2b… · "Elaborate, she's an engineer who designed the software and multiple coworkers vo…"
Comment
This was one of the weak points for me as well. I saw the proof-of-work blockchain as a wasteful enterprise because crypto mining was so energy intensive compared to the value it was generating, especially compared to conventional payment systems.
LLMs might be very costly to train, but that only happens once, and the cost of that training is spread across all the billions of times it is used to generate an enormous variety of useful things, far more useful than just "jokes". If an LLM is used to replace a human at a job, what is the total carbon cost of raising that human and keeping them alive, just so that they could read a PDF and answer some questions? That's the real comparison. Seems like a very reasonable tradeoff to me.
Source: youtube · AI Responsibility · 2023-11-06T12:5… · ♥ 59
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_Ugyg82mTX_D-Hay4UlV4AaABAg.9wm9MmyKkTh9wmRponcdGI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyg82mTX_D-Hay4UlV4AaABAg.9wm9MmyKkThA-Fbut9QgPr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyvAuyoxsti5Znfu994AaABAg.9wm86OHQc7m9wmr8WVRYQW","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgyvAuyoxsti5Znfu994AaABAg.9wm86OHQc7m9wnDBLybBgW","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw4nBsUJVAVu0YW2zx4AaABAg.9wm4qB3hP-U9wnipWYcY0M","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugw4nBsUJVAVu0YW2zx4AaABAg.9wm4qB3hP-U9wwwNdPq-8h","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw1urvKyb3Es0isIhJ4AaABAg.9wm1oY8G2kV9wnF70lJAyF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzRi1Q9lbVf74JFdhV4AaABAg.9wm0ylfHQHSA65ygAMUJiQ","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwgkiNLF6XFYw49aZF4AaABAg.9wm0wFg_WW79wm2phalCeF","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwgkiNLF6XFYw49aZF4AaABAg.9wm0wFg_WW79ww0gUoLl14","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
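The look-up-by-ID view above can be reproduced offline. A minimal sketch in Python, assuming the raw response is a JSON array of objects with the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_by_id` helper and the shortened two-entry sample are illustrative, not part of the app:

```python
import json

# Two entries copied from the raw response above, as an illustrative sample.
raw_response = """
[
  {"id": "ytr_Ugyg82mTX_D-Hay4UlV4AaABAg.9wm9MmyKkTh9wmRponcdGI",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzRi1Q9lbVf74JFdhV4AaABAg.9wm0ylfHQHSA65ygAMUJiQ",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse one raw LLM response and map comment ID -> coded dimensions."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_id(raw_response)
lookup = codes["ytr_UgzRi1Q9lbVf74JFdhV4AaABAg.9wm0ylfHQHSA65ygAMUJiQ"]
print(lookup["reasoning"], lookup["emotion"])  # prints: deontological outrage
```

Note that look-up requires the full comment ID; the truncated IDs shown in the sample list (e.g. ytc_UgwL7_xXr…) are display shorthand and will not match.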