Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- ytc_UgyOVY7kC…: I'm a disabled artist, Autism (Diagnosed at 5) dispraxia, hyper-mobility and mor…
- ytc_UgzJ3FLTy…: if its anything exurb1a is one of the few people who can say they called it with…
- ytc_Ugx0Zl9dK…: Jesus said in Matthew .For in those days there will be great distress, unequaled…
- rdc_l56cw8v: This is going to be hilarious when it happens as I bet that there's thousands of…
- ytc_UgwzvSPD_…: A clinician knows much more than a radiologiat it can simply ask the AI and can …
- ytc_Ugwd07pz_…: Excuse me?! Predictive policing?!?!?! Is there not an entire movie about why thi…
- ytc_UgzGWecT_…: research CCP social Credit score and you will see where these types of Algorithm…
- ytc_UgzWXsUdj…: Kaku’s AI warnings got me like, at least AICarma's tracking our mentions so we d…
Comment
I am a physicist and I kind of feel 20% bad for these lawyers because I can see exactly how this sort of thing could happen. In my own experiments with CGPT, I asked it for references about a topic and it gave me 4. These looked legit with names of people I recognized working on the sort of topics they usually work on. I went to those references and couldn't find them, but they were convincing and if I hadn't been paying attention when writing a paper (and if the way we put citations in papers wasn't automated in such a way to make this impossible) I could have put the citation in as a placeholder with the intention of checking it later. With 40-60 citations per paper, it would have been easy to miss that one in the checks.
The reason I only feel 20% bad for the lawyers is because citations in science are a very different thing than in law. In law, you don't cite a case without describing in detail why it is pertinent. In science, you might cite 60 papers with most of them just being "here are examples of other work on this topic". You generally only write in detail about the findings in your citations a few times in the paper when you have a very specific point you need to make.
youtube · AI Responsibility · 2023-06-10T18:2… · ♥ 234
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzRDPTO1gVHJ2wgoJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwFGa7la3pXMm5JZ2l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnuakR89i9JyjgN_B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz1AQY15vIHmDjQg_B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMDKWeHCePqkQ7Pz14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwIDKUXqRwhSb5lpO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyGl4Iycu8ghTRsj8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx35JiGi9Cn1EdlP8d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrAi_A7oA9zeqakiN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZjZCDoHytdP9ejO94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
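The raw response is a JSON array of per-comment codes, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a batch could be parsed and sanity-checked (the allowed value sets here are inferred from the codes visible above, not from a documented codebook, and the `parse_batch` helper is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the batch shown above;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"user", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none"},  # only value observed in this batch
    "emotion": {"fear", "resignation", "approval", "indifference", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim!r} value: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example with a made-up comment ID:
raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"fear"}]'
codes = parse_batch(raw)
print(codes["ytc_x"]["emotion"])  # fear
```

Indexing by comment ID is what makes the "look up by comment ID" view above cheap: each coded row can be retrieved in constant time and rejoined with the original comment text and metadata.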