# Raw LLM Responses

Inspect the exact model output for any coded comment; comments can be looked up directly by comment ID.
Random samples:

- "Exactly and their only thought is thinking we should stop AI advancements and h…" (rdc_nt69gcw)
- "I'm not a lawyer, but looking through the license for Stable Diffusion, I do thi…" (ytc_UgyvoR103…)
- "Just remember, OpenAI totally fine sacrificing people's lives, happiness, and he…" (ytc_Ugwfai2AR…)
- "If we can't have black slaves leave it to us to make a robot slave. 100 years th…" (ytc_UgzgRlqvU…)
- "When you explained the benefits of self driving cars I found it utterly similar …" (ytc_Ugzj_C2Xy…)
- "I'll forever wonder if it's got anything to do with the energy consumption that …" (ytc_UgxjvWORg…)
- "self-identity starts at the moment we see ourselves as a separate, specific indi…" (ytc_UgzoMrPmw…)
- "Does it even matter. If one is fired that's it all over regardless of who fires.…" (rdc_dkzonab)
## Comment

> AI is a tool. Tech like this ‘think’ on a binary system whilst the human mind can be intelligent like AI but might not be as close. The creative mind in a human is like an ocean, ebbing and flowing while it can still having logical consistency. AI is a tool to use, but I don’t think it will capture that unique ability that is exclusive to humans. A piece of tech that deals in absolute variability might not make sense of moments in existence that don’t make sense (of course I’m talking about the human condition and it’s consistent and yet inconsistent variables).

Source: youtube · Posted: 2024-06-18T05:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
{"id":"ytc_UgxvWzUraOiYj-RVdLB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzi8eGek6Oj5RHGb1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyy-ocuWLuVz-V3qrB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAPmthEzAqag2M8Tl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz79qFRs9FtcCvy2WN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7rfVHrnHZn7ldHzJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylP5V6Hs7KSfG1bJZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz9QTTU7MY0DGUWOwp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgR9FO3Y6-83elEkN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTsrNRT0A80ZHiiVh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
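The raw response above is a plain JSON array, one object per comment, so looking up a comment's coded dimensions by its ID is straightforward. A minimal sketch, assuming only the standard library (the `index_by_id` helper and the shortened sample array here are illustrative, not part of any tool's API):

```python
import json

# Illustrative excerpt of a raw LLM coding response: a JSON array of
# per-comment objects, each keyed by the comment ID.
raw_response = '''[
  {"id": "ytc_UgxvWzUraOiYj-RVdLB4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwTsrNRT0A80ZHiiVh4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

def index_by_id(response_text):
    """Parse the model output and map each comment ID to its coded dimensions."""
    records = json.loads(response_text)
    return {record.pop("id"): record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgwTsrNRT0A80ZHiiVh4AaABAg"]["emotion"])  # fear
```

Since the model returns the whole batch in one array, indexing it once keeps per-comment lookups O(1) when rendering views like the one above.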