Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "SoFi bank, Google Fiber, Comcast, tmobile, Verizon, Amazon, Walmart, target, AT&…" (ytc_UgwEUntoe…)
- "Honestly as long as they aren't convicting people with this as evidence I'm 100%…" (rdc_e1tvb3c)
- "I’m a machine learning engineer and researcher. Very well explained. Tip to the …" (ytc_UgxX-Ayn7…)
- "Interesting ... How can we KNOW, for certain, whether this is indeed an actual …" (ytc_UgwcIePda…)
- "The issue here is as old as the scriptures and as clear as the constitution of t…" (ytc_UgwyuMJKs…)
- "Nice video on the status of Autonomous driving! I was just hoping to hear someth…" (ytc_UgzjB8oHr…)
- "So what I think will happen will either: 1. Some Detroit Become Human type shi…" (ytc_UgzmDjqU5…)
- "Asking what the difference is between a LLM and the way humans learn things is a…" (rdc_jmuhvc3)
Comment

> The problem is that if AI stays like he describes it, without creativity, then no one would want it writing scripts. It BECOMES dangerous once it gets good and creative. We’re machines just like computers, and sometime SOON, AI will be an extremely competitive writer. If he didn’t believe that, he wouldn’t be so worried, and we should be. Because it will become a problem. But it’s not gonna go away. Instead of a team of writers, it’ll soon be two writers and an AI. I’m a scientist, and I use AI all the time to assist my experiments. Soon, it’ll be harder for me to get a job too because less people will be able to do more with less.

youtube · AI Jobs · 2023-08-01T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzHpCA6FswYfv0g-bR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw4AU6P40VksfWj_fB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzaw4SxN9MkbF-MLLl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxcMyxaW4I8LJP--sZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxxfdAs6VDAb1vEvAl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxLci1GXkp9usF0rwF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugx8P3h5EoL7lSHjVb54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw4niYGNRjtDFwdKCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyZ-aBerpomm4YdzuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx1s3GqoFS3l4MdOC54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
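A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the tool's actual implementation: the category sets are only the values observed in the samples above (the project's real codebook may include more), and the function name `index_codings` is a hypothetical helper.

```python
import json

# Category values observed in the coded samples above; assumed here,
# the full codebook may define additional values.
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of row objects) and index
    the rows by comment ID, skipping any row whose value for a dimension
    is missing or outside the known category set."""
    rows = json.loads(raw_response)
    indexed = {}
    for row in rows:
        if all(row.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            indexed[row["id"]] = row
    return indexed

# Example lookup against a two-row response (IDs are made up):
raw = '''[
  {"id":"ytc_abc","responsibility":"ai_itself","reasoning":"consequentialist",
   "policy":"unclear","emotion":"fear"},
  {"id":"rdc_xyz","responsibility":"company","reasoning":"deontological",
   "policy":"liability","emotion":"indifference"}
]'''
codings = index_codings(raw)
print(codings["ytc_abc"]["emotion"])  # fear
```

Validating against the category sets at parse time catches malformed or hallucinated labels before they reach the coded-results table.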