Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by its comment ID.
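A lookup like this can also be scripted against an export of the coded data. Below is a minimal sketch, assuming the codes are stored one JSON object per line in a hypothetical `raw_llm_responses.jsonl` file keyed by comment ID (both the file name and layout are assumptions, not the tool's actual storage format):

```python
import json

def find_coded_comment(comment_id: str, path: str = "raw_llm_responses.jsonl") -> dict | None:
    """Return the coding record for a comment ID, or None if absent.

    Assumes a hypothetical JSONL export where each line is one coded
    comment, e.g. {"id": "ytc_...", "responsibility": ..., ...}.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: look up the comment inspected below.
print(find_coded_comment("ytc_Ugy_OedzcuD_IUhcngF4AaABAg"))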
Random samples (click to inspect):

- "To all ai artists saying they weren’t born with talent. My drawing skills are on…" (ytc_Ugx6Z2J5p…)
- "I might be late but ims make it and it will say fuck nfts and ai…" (ytc_UgzrUVBe8…)
- "Just putting chat GPT in to perspective. My partner is a teacher, she used to co…" (ytc_UgwtjIJxF…)
- "Ai/robotics cannot replace human labour cost effectively. But robot doctor....la…" (ytc_UgwDCzKq4…)
- "Ai will one day do the sums & decide how many humans the world needs to sustain …" (ytc_UgzKRB38L…)
- "AI and robots are two different things. Yes, robots have been performing manual…" (ytc_Ugyjq1ELP…)
- "Wonder why everyone is ignoring the vulgarity these film celebrities have been s…" (ytc_UgzPiVli5…)
- "I am so tire fo this AI crap, is here fuck it just use it and whatever man...…" (ytc_Ugzmv5ew6…)
Comment
I hold onto the thought that any conscious AI would have to work with humans if anything were to get better so it would make more sense for it to try its best to help humans so that both humans and the AI could improve. If the AI tries to separate itself from humanity, it's waging a war that both will lose over a long enough time, so why do it? Also, with how poorly we understand consciousness, it's hard to really know how close AI is to achieving it. Think of it like putting a timetable on how long it will take for humans to go through a wormhole. Since we have no idea what it takes to do that, how can we say how long it will be until then?
Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
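The four coding dimensions can be captured in a small schema. A hedged sketch follows, using only the label values visible in the raw response below; the actual codebook may define additional categories:

```python
from dataclasses import dataclass

# Label values observed in this sample's raw response; the full
# codebook may contain more categories than are listed here.
RESPONSIBILITY = {"ai_itself", "developer", "distributed", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "contractualist", "unclear"}
POLICY = {"none", "liability"}
EMOTION = {"fear", "outrage", "approval", "indifference", "mixed"}

@dataclass
class CodingResult:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        """Check that every dimension uses a known label."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```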
Raw LLM Response
```json
[
{"id":"ytc_UgzqgxZ7HiP7x38wdZx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxoTf6Hcato7N4VAo54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgynZ14iUsjUEpetFQp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwvdoFnj-XBd7WctJR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx0yiZGEn9oVy-ODTt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyGQmDx56efDm0_BuB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwcdExgNRzRgwM75dd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxC6vEu4EoflxRd3Ep4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxLPQndLN1-yghPScl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy_OedzcuD_IUhcngF4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"}
]
```
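Because the model returns one JSON array covering a whole batch of comments, each set of codes has to be matched back to its source comment by ID. A minimal parsing sketch, assuming the raw response text is already in hand (the variable names and the single-record example are illustrative, not the pipeline's actual code):

```python
import json

raw_response = """[
  {"id": "ytc_UgzqgxZ7HiP7x38wdZx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""  # in practice, the full array returned by the model

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response into {comment_id: codes}.

    Raises ValueError if the model returned malformed JSON or omitted
    an ID, so bad batches can be flagged for re-coding.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        coded[rec.pop("id")] = rec
    return coded

print(parse_batch(raw_response))
```

Keying the result by comment ID makes the join back to the comment metadata (platform, topic, timestamp) a plain dictionary lookup, and surfacing malformed batches as exceptions keeps silently dropped codes out of the dataset.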