Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Well, Matt Yglesias says the Bangladeshis should be happy to work in collapsing …" (rdc_d3sgsoh)
- "Okay one it takes people's jobs. Two people just hate AI in general. Three peopl…" (ytr_UgyWs8tut…)
- "He aint the brightest. A tractor existed in one industry. AI exist in every form…" (ytc_Ugxi5z8nw…)
- "Meanwhile Larry Ellison & Oracle is quietly getting access to everyone’s onl…" (rdc_o87jnxu)
- "Yeah I'm not saying prices are fixed, or that some instantaneous doubling occurr…" (rdc_d7kszco)
- "Dude, are you me? This is why i'm not applying for McDonald's right now.. i don'…" (rdc_n0ov5b4)
- "Wasted too many hours on this website lol / Don't lie, one of your AI characer wa…" (ytc_UgyToTpVy…)
- "We could get more work from home if not the fucking idiot new gens posting video…" (ytc_Ugy-un1rF…)
Comment
I'm a long-term hobbyist artist, -- and while I personally as a huge nerd and fascinated by the potential of ai in producing art I as an individual have no issue with my art being used in a data set -- I do agree that there are big companies making money from AI art generated from hard-working artists who do art for a living. I also think there are small developers and even colleges and stuff with legitimate reasons for the purpose of research and I personally don't have a problem with that. In *my personal opinion*, I agree that AI is absolutely here to stay, and there are ethical issues that could be difficult to potentially resolve, but at the bare minimum, they should absolutely be 100% free to the public.
Platform: youtube · Source: Viral AI Reaction · Posted: 2023-01-12T05:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwf9s3W73p6C25oeYR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZxHXKFcJgZls7h4t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZ84rP748kNiLKLQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"sadness"},
{"id":"ytc_UgwTmMAD7Voj_xQ8PeR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxu4pJ-ibpKlBLleZ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0WJQ9nbSNst_1c514AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzjkaKbVz9Z4OVgWOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9I5Op6WamoP157BR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyv91j5kFWrtK3I_gp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwNNJNd-mSerTbG7qR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
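Since the raw response is a JSON array of coding objects keyed by `id`, looking up the coding for a specific comment is a matter of parsing the array and building an index. A minimal sketch, assuming the field names shown in the response above (the two-row sample here is abbreviated from it, and `index_by_comment_id` is a hypothetical helper name, not part of the tool):

```python
import json

# Abbreviated raw LLM response: a JSON array with one coding object
# per comment, using the dimension names shown above.
raw_response = """[
  {"id": "ytc_Ugx0WJQ9nbSNst_1c514AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "regulate", "emotion": "resignation"},
  {"id": "rdc_d3sgsoh", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and index each coding object by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
print(codings["ytc_Ugx0WJQ9nbSNst_1c514AaABAg"]["emotion"])  # prints "resignation"
```

Indexing by ID this way also makes it easy to spot comments the model skipped or coded twice: compare the dict's keys against the batch of IDs that was sent.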