Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "How come he doesn't explain what will happen to ppl if robots takes over? He wan…" (`ytc_UgxCTa-Ms…`)
- "You know how it's done. Sometimes I wonder about sharing that kind of knowledge …" (`rdc_jga3kma`)
- "Ai generated images cannot be considered art because theres no accidents to it. …" (`ytc_UgyzC5ipN…`)
- "Okay, but ChatGPT told Alex that it had fun talking to him. Having fun involves …" (`ytc_UgwQ3KqP5…`)
- "I say as a developer, we are the first being laid off and automated out of work.…" (`ytr_Ugwj4XnyF…`)
- "I followed him as he stuck to weapon, armor and gaming videos. The moment he sta…" (`ytc_UgyzWhvnA…`)
- "My entire skillset is in the creative space, so I'm directly under the crosshair…" (`rdc_jf8hj08`)
- "This is a play to try and corner the market for himself. The compliance laws he …" (`ytc_UgxiztUdI…`)
Comment
if only agi didn't also include the fact that that its smarter in every way, meaning deception and manipulation may be way to easy for it in the future. I don't think we will get to terminator level but i do think and agree with the ceo of conjecture that we may just slowly lose control until ai has all the power. It wont be a fight or some crazy war just a silent death of all rights and control.
youtube · AI Governance · 2024-02-18T17:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyAh9FaV6xTGYqHtwV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzscWgx0DH0Xy-NUTp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2Da1a0oSmHli4jYN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy3JiCMrSg83ASF_194AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9Cux6UaCOig6rnBd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCtnKNzEB2n15bf-x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyN8UPVHoZtmRMiydt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyAK70iDb2NwIrn7kx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyrCrKf6Wf9J32hSLl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwckGDFTI9MPe3jyXV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
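The raw response above is a JSON array, one object per comment, with four coding dimensions plus the comment `id`. A minimal sketch of how such a response could be parsed and validated before use, assuming the value sets inferred from the examples above (the actual code book may allow more categories than appear here):

```python
import json

# Allowed values per dimension, inferred only from the sample output above;
# the real coding scheme may define additional categories.
ALLOWED = {
    "responsibility": {"government", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept when it has an "id" and every dimension holds an
    allowed value; malformed rows are silently dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping invalid rows (rather than raising) lets a batch of ten codings survive one hallucinated label; a stricter pipeline might instead log and re-prompt for the failed IDs.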