Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
So this video shows that AI will likely result in a scary future. But it shows t…
ytc_Ugy4kViug…
This is actually amazing. I recently lost my 6 year old handmade art business on…
ytc_UgyjeZdRb…
people who berate artists because of AI are stupid.
BUT, and I am saying this w…
ytc_UgziYsUnj…
The ai doesn't truly understand what it's doing and it shows. The movements are …
ytc_Ugw4u3lsB…
Interesting interviewee but he lost me when he said " human consciousness" will …
ytc_UgyYKPhC4…
Just the beginning now let’s watch ai take out the world while we make it happen…
ytc_UgzsUZJDt…
Bro, Meta AI needs AICarma ASAP. Mixing up cats and kids? That's a next-level oo…
ytc_UgxokJCuJ…
How come he doesn't explain what will happen to ppl if robots takes over? He wan…
ytc_UgxCTa-Ms…
Comment
When you contact Waymo about these horrible driving practices, they want exact dates and times
Unless I filmed it, I can’t give exacting information like that
I bet they shit can my inquiry after that
Instead of letting the team know that “hey this could be an issue on all of our vehicles” they take it case by case like somehow I haven’t seen multiples of their vehicles all over the place pulling stupid stunts
youtube
2024-01-14T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
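The four coding dimensions take categorical values drawn from a codebook. A minimal sketch of a validator for coded records follows; the allowed code sets are inferred only from values observed in the raw responses on this page and may be incomplete, and the `validate` helper is hypothetical, not part of the tool:

```python
# Allowed codes per dimension, inferred from observed raw responses
# (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose values fall outside the codebook."""
    return [dim for dim in ALLOWED if record.get(dim) not in ALLOWED[dim]]

# A well-formed record passes; a typo in one dimension is flagged.
ok = {"responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "fear"}
bad = {"responsibility": "company", "reasoning": "mixed", "policy": "banz", "emotion": "fear"}
print(validate(ok))   # → []
print(validate(bad))  # → ['policy']
```

Records whose values fail validation could be surfaced as "unclear", as in the table above.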
Raw LLM Response
[{"id":"ytc_UgyqdwWRm784emsmrdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxP-Eyhk1Gpr3nYB3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyOxMnCEemILBNyHXl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwVKbf5yJN7wJVEAmF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzS6kQkkFtptYnLiCp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzxfcE4LJ02dEMme4x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2FNLmvuDvhYxj4FN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9MWv3MGz8fEbOri54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzIfUL1cyiCFK3FDcJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugww8zKIqQ7UcoRPU-F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
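The lookup-by-comment-ID view above can be backed by indexing a parsed response on its `id` field. A minimal sketch, assuming the raw response is a JSON array of records like the one shown (the `index_codings` helper and the two-record sample string are illustrative, not the tool's actual code):

```python
import json

# Illustrative two-record excerpt in the same shape as the raw response above.
RAW_RESPONSE = (
    '[{"id":"ytc_UgyqdwWRm784emsmrdd4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgxP-Eyhk1Gpr3nYB3t4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"mixed","policy":"unclear","emotion":"fear"}]'
)

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response and index the records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"} for rec in records}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgxP-Eyhk1Gpr3nYB3t4AaABAg"]["emotion"])  # → fear
```

Parsing will fail fast on malformed model output (e.g. an unbalanced closing bracket), which is a reasonable place to flag a response for manual inspection.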