Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We appreciate your comment. On the AITube channel, we use advanced artificial in…
ytr_UgyCcHNfM…
0:03 starting with a false premise: no proof whatsoever that AI is causing any l…
ytc_UgztwA_wz…
The Eews feel they are superior. They have successfully turned the rest against …
ytc_Ugxct5lR0…
As a young man I received an education in shoveling dirt. I learned how to be ef…
ytc_UgwlOnX4U…
Some manual jobs are much harder to replace with AI and automation than white co…
ytc_UgyN5X1H7…
We wanted a computer to dance around a question? Is that why AI was invented? …
ytc_Ugw-xzYRm…
If all cars were self-driving, it would save countless lives, and someday that w…
ytc_Ugxa6LHN3…
What next, self driving cars being more likely to run over black people than whi…
ytc_UgyVMh7-i…
Comment
Let's just list the facts here:
1. Motive: Balaji = star witness in $B NYT copyright suit → OpenAI bankruptcy risk.
2. Timing: Exposed Oct 2024 → dead Nov 26.
3. Wig in blood, including hairs pulled from the wig: Not his → killer’s disguise fell off in struggle.
4. Cut elevator wires: Premeditated cam sabotage day-of.
5. 2 wounds + back-entry shot: Impossible self-inflict (private autopsy).
6. No GSR on hands: He didn’t fire the gun. (there are very few to no suicide cases in the history of humanity without GSR on hands...)
7. GHB spike: Drugged via DoorDash meal → subdued, staged.
8. Deadbolt trick: Exited with string → classic pro staging...
9. Missing HDD: Whistleblower files stolen — targeted hit.
10. Sam Altman is a Jewish and Gay who met his husband at Peter Thiel's "Toilet Parties" (Drugs and laxatives) Think 2 girls 1 cup but 75 feeders and 75 eaters (if you know you know)
youtube
2026-04-08T10:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
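A coding like the one above can be checked mechanically before it is stored. The sketch below validates one coding against the four dimensions shown in the table; the allowed value sets are an assumption inferred from the codings visible on this page and may be incomplete.

```python
# Allowed values per dimension — inferred from the sample codings on this
# page (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

result = validate_coding({"responsibility": "company",
                          "reasoning": "consequentialist",
                          "policy": "liability",
                          "emotion": "outrage"})
print(result)  # → []
```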
Raw LLM Response
```json
[
  {"id":"ytc_UgyNgQiRaLydCImNHUh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyz1YGxxU-7DAaGKuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwL_58rZyyUr7k_BNZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwIqv-tfKTHC_BHagZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx8Ov4N257iyyuUh394AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymDU_j-vldS5XRCcp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwYQZHATLSsF9-QODp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw5b18I3RzqJIYX6e14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwb0jVX3atss1YnHkN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxG9MrGTr6c3By-YQd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```