Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "art isn't about just talent, it's about skill and the time you put into developi…" (ytc_UgyOj4A1Q…)
- "That's indeed the tricky part, Kendrick! Accountability can't just fall on the i…" (ytr_UgyN7gjwl…)
- "We already have 'AI' existing as the imortal 'being'. It exists and fails to ho…" (ytc_Ugw6ig-tZ…)
- "It’s killing AI Art. AI ART… not any other AI programs.. Jesus.. You’ve got a br…" (ytr_Ugxl4p20H…)
- "@bombtubejamz739 It's possible for AI to be harmful if they become advanced eno…" (ytr_UgxeIzOyF…)
- "He got something half-wrong at the start: the name is not wrong. It's just that …" (ytc_UgxENx4v8…)
- "To add to this, if you ask ChatGPT to do a school programming assignment it is _…" (rdc_jad5f28)
- "well you can use this tech as a serious tool but only to just mess around and or…" (ytc_UgykYCq7_…)
Comment
I’m a bit of a fatalist when it comes to these kind of things but I’ll parallel the invention of AI to the invention of the nuclear weapon. Now we as human must live with the fear of nuclear Armageddon at the hands of governments we do not trust and by mechanisms to powerful to derail. I think most of us would largely agree that perhaps that genie was better left in the bottle. But if not us then surely someone else right? Given the progress of technological advancements at that time it makes sense that someone would develop a nuclear weapon. If we should refrain from inventing this technology will others refrain as well? We did not refrain, in fact we raced to develop them. We are ABSOLUTELY headed towards a future where we look back and say “Had we only know then what we know now”.
youtube · AI Governance · 2023-05-17T10:2… · ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy7077bntBMstBT1R94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxGZ-mM2OcnMSWDPpt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwI1q-het8yxTQn-Yl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyk9JPqd43xFjrh26d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyptKh4c5yfbWW6SbB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzqo3vyetWpEbt4TNF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy69TkwGtyotvTzy_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6U5F2bS0YoLThD7J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzNilJdNQWMbrFcsTN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzzQ-JIgxOU4gsnN4l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
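A batch response like the one above has to be parsed and validated before the codings can be stored, since the model may emit malformed entries or labels outside the codebook. The sketch below shows one way to do that; the `SCHEMA` label sets are inferred from the values visible in this response and are assumptions, not the project's actual codebook, and `parse_raw_response` is a hypothetical helper.

```python
import json

# Allowed labels per coding dimension, assumed from the values observed
# in the raw responses above (hypothetical closed sets, not the real codebook).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "government", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"approval", "mixed", "outrage", "fear", "resignation", "indifference", "unclear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed codings."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop malformed entries rather than crash
        # Every dimension must be present and carry an allowed label.
        if all(rec.get(dim) in labels for dim, labels in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"resignation"},'
    '{"id":"ytc_y","responsibility":"alien"}]'
)
parsed = parse_raw_response(raw)  # only the first record survives validation
```

Silently dropping invalid records is one design choice; a production coder would more likely log them and re-prompt the model for the failed IDs.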