Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "AI will create more jobs than it will destroy. Horse people said the same thing …" (ytc_UgwvF4OSU…)
- "This was actually a conference on the impact of AI ... im all for conspiracies n…" (ytr_Ugy62lFX5…)
- "1:25 I definitely think AI *can* be used that way. Simple text-to-image isn't ev…" (ytc_Ugyr85iQp…)
- "I understand your concerns! The balance between AI and human wisdom is definitel…" (ytr_UgyYiFmdz…)
- "Whenever my classmate sent a selfie or a picture especially me / My classmate rema…" (ytc_UgyyESRfD…)
- "I wasn’t able to get Nighthshade or Glaze, but instead found some AI Disturbance…" (ytc_Ugxh7XjIW…)
- "At least AI wont change the fact that they will want you to pay them in gift car…" (ytc_Ugza2CPWs…)
- "Neural networks can absolutely be randomly sampled from using randomly generated…" (rdc_khzj5nz)
Comment
I think humanity's greatest existential risk will be the time between the development of artificial general intelligence, that's still controlled by humans, and the development of sophisticated autonomous weapons up until the birth of artificial superintelligence.
If we are able to survive to that point, I think the superintelligence would likely logically choose to help us survive.
I really don't see a reason as to why it wouldn't.
youtube · AI Governance · 2025-06-17T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
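Each coded comment is expected to carry exactly these four dimensions, each drawn from a fixed label set. A minimal validation sketch follows; the allowed values are inferred from the sample output on this page, and the real codebook may define more (an assumption):

```python
# Controlled vocabulary inferred from the coding results shown here;
# the actual codebook may include additional labels (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one coded comment."""
    errors = []
    if "id" not in row:
        errors.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The row shown in the Coding Result table above passes cleanly.
row = {"id": "ytc_UgxqNc2i5-uKafsJ9-N4AaABAg",
       "responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "none", "emotion": "approval"}
```

Running `validate_row` over every parsed row is a cheap way to catch a model that drifts outside the label set before results reach the table view.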
Raw LLM Response
```json
[
{"id":"ytc_UgxqNc2i5-uKafsJ9-N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtN-IjtBBpVv3Ugdl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwYbRWOFmbNh4QNTRB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTpecjewLSL1AAKGF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQJu5vk3tslmruuxd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzpi_qpkAmDy58YyUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCMSUvHIy_DloGWQ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyoeERaLSu-2gEwQjd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzAwVb-RmeMQLobX254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyWM39ZgCVQSIPG2Sh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```