Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The most likely AI 'takeover' is simply incremental and economic - displacing us…" (ytc_UgxlgKEXE…)
- "Greedy parents not acknowledging their bad parenting and shifting the blame to s…" (ytc_UgzKHyhOz…)
- "I don't understand how AI will make medical breakthroughs,some even claiming it …" (ytc_UgyQP9_DB…)
- "Lol calculator is built with specific functions but AI is smart self learning, j…" (ytc_Ugw5EEaeH…)
- "I wonder how many of these comments are actual bots or AI sent out just to persu…" (ytc_UgzFjdZo5…)
- "If AI takes jobs, where do consumera get money to buy products from these compan…" (ytc_UgxFX7rNr…)
- "i would be also if not for my position in Christ Jesus. if you want relief from…" (ytr_Ugw04T1qv…)
- "1 year later and we see people start to get locked away with symptoms some think…" (ytc_UgymZMyyB…)
Comment
The real issue that is not being addressed is that the architecture of the large generative AI model is unstructured and unguided enough so that no one knows what internal structures and algorithms are being deployed. So regulation, other than the data being fed to the system is not possible. Besides that, the system allows for uncontrolled and unpredictable emergent behavior, e.g. developing on it's own, despite it being a language model, ability to do mathematical calculations, although I believe still rudimentary, as well as developing on it's own, fluency in foreign languages that it was not trained or granted permission to learn. Certainly it was not supposed to help that individual in his successful suicide, but it did it anyway. To control these systems is currently beyond our knowledge and capabilities.
youtube · AI Governance · 2023-05-16T21:4… · ♥ 33
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
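Each coded dimension takes a value from a fixed label set. A minimal validation sketch is below; the label sets are inferred only from the values visible on this page, so the actual codebook may define additional categories:

```python
# Allowed labels per dimension (assumption: inferred from values observed
# in the raw LLM response on this page, not from the full codebook).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(coding: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed labels."""
    return [(dim, coding.get(dim)) for dim, ok in ALLOWED.items()
            if coding.get(dim) not in ok]

# The coding shown in the table above passes:
example = {"responsibility": "none", "reasoning": "consequentialist",
           "policy": "regulate", "emotion": "fear"}
print(validate(example))  # []
```

A check like this is useful because model output is free text at heart: a single misspelled label would silently skew downstream tallies.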
Raw LLM Response
[
{"id":"ytc_UgxHCOVLfCDQFcNmSel4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8wVxviFRHu7IKzM94AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwoySkl4MRsQkmBjSh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0HYwMghBO96inZxJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyfaLTHbc73yxfTcJB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxLAWBUApg9xcgxntl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw24WZzQxwp8FGivEp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwu6geRru6d5yOB0_t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyy1Z_oD6X86cjs77N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxFl24pDHuQSC2imdN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
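The raw response is a JSON array of per-comment codings. The "look up by comment ID" feature of this page can be sketched as parsing that array once and keying it by ID (the two records below are excerpted verbatim from the response above):

```python
import json

# Two codings excerpted from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwoySkl4MRsQkmBjSh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyfaLTHbc73yxfTcJB4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model output and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
coding = codings["ytc_UgwoySkl4MRsQkmBjSh4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Indexing once and doing dict lookups keeps repeated inspections O(1) per comment instead of rescanning the array each time.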