Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyN5eDHk…: "i expect better from Bernie than repeating what CEOs who profit by fake boosts o…"
- ytc_Ugz3AineM…: "The reason in my opinion that AI supporters are spewing, AI is going to take job…"
- ytc_UgyxhF6YW…: "15:32 THIS!It enrages me when people say "ai ia teh only way disabled people can…"
- ytc_UgylwY5ei…: "Generative AI has no good reason to exist, it was made to replace us, that’s it…"
- ytc_UgwIdJeK5…: "But, but.... if millions upon millions have no work... what will they do? And …"
- ytc_UgwNnQo72…: "As an artist i support ai art in... specific concepts. You see some people code …"
- ytc_UgzNVC1yd…: "Tesla Autopilot has been involved in 467 crashes, 13 fatal. Meanwhile 16,248 car…"
- ytc_Ugz37s4NQ…: "This is disingenuous or naive. A.I. won’t replace developers/engineers it will r…"
Comment
Fascinating discussion but for me, ultimately a frustrating one. I feel like the questions should be “what prevents a sufficiently advanced AI (i.e. one that is smarter than humans across the board) from wiping out humanity”, “what, if anything, could be done to prevent it?” and “how long have we got?”. Based on 40 years of working with computers I would guess the answers are “nothing” “likely not very much” and “maybe sooner than we’d think”but I’d love to hear Professor Wolfram’s answers to these questions
youtube · AI Governance · 2024-12-05T11:1… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwhetJyaa7zaqwDeDN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugyk-OKW3TGozQB_eBp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzX7RB92cDYXHhwK_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwKpMq6JhlIFF28Tex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2JBNpRY8BevvEIeN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyrO0-H4ZqH4OnPqH94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxzMsmvWvck0BqRPNx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyeUBmE_5VG4ux2gSp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz2FpwtTbLpyQH34Kl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxIY763xz6KMKWDgDl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```