Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Comment ID | Preview |
|---|---|
| ytc_UgzOSBvPA… | The only way someone could make me look at ai art is if I want to tourture my se… |
| ytc_UgwcMbOVc… | "Experts" always get this wrong. "Oh, AI is so dangerous is Altman the guy we wa… |
| ytc_Ugzrcn8Is… | Scariest AI conversation I've ever head. There are obviously algorithmic prompts… |
| ytc_UgwFE-FHa… | Is this guy aware of those websites where people spend hours roleplaying with AI… |
| ytc_UgzzhANk9… | Can you make a video in Title inflation among software engineering roles post AI… |
| ytc_UgwbfvML2… | I've been saying this FOR YEARS !! ... : " Forget 'plastic straws/bags' ... in t… |
| ytc_UgxI2104f… | If AI and robotics were to take all jobs, which I suspect they will take most jo… |
| ytc_Ugy_E0h7i… | I can't find anyone that's saying I can't wait for AI to come out and........ Th… |
Comment
You'll never convince me that AI will ever have any form of "Sentience." The real danger of AI is simply what Humans program it to do, Not AI itself.
Most of the examples given here is as you presented, that it uses human speech to garner "connections." It's just a program that uses this, nothing more.
youtube · AI Governance · 2023-07-07T22:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
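The table above pairs each coding dimension with the value assigned to this comment. A minimal sketch of how such a table could be rendered from one coded record, assuming the field names used in the raw LLM response below (`render_coding_table` is a hypothetical helper, not part of the pipeline; the "Coded at" row is a timestamp added at coding time, so it is omitted here):

```python
def render_coding_table(rec: dict) -> str:
    """Render one coded record as a markdown Dimension/Value table.

    Expects the four dimension keys seen in the raw LLM response:
    responsibility, reasoning, policy, emotion.
    """
    labels = {
        "responsibility": "Responsibility",
        "reasoning": "Reasoning",
        "policy": "Policy",
        "emotion": "Emotion",
    }
    rows = ["| Dimension | Value |", "|---|---|"]
    for key, label in labels.items():
        rows.append(f"| {label} | {rec[key]} |")
    return "\n".join(rows)
```

Feeding it the record for this comment would reproduce the four dimension rows shown above.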
Raw LLM Response
```json
[
{"id":"ytc_UgyZon6b-Q1NHCYcLPN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxyDEVVYS7ZtiTgeSF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz94wP5JGrChj8-IVF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzHCeBdpIebh4Gj_ax4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxHt1YvcljuNrMfbsx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWpoHBZNGsfX6pMBJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCdkBkoUy6qGCf55t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7cl44rk2dykDc5F14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzhI6-qnbT5uhCXDUF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwvqezGa-lnuerAps94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
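The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of a parser that validates each record and indexes it by ID (as the "Look up by comment ID" view does), assuming the value vocabulary is limited to the labels visible on this page; the real codebook may define more values:

```python
import json

# Allowed values per dimension, inferred from the codings shown on this
# page ("unclear" is assumed to be valid for every dimension).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "unclear"},
}


def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Raises ValueError if any record carries a value outside the
    assumed vocabulary, so malformed model output fails loudly.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id
```

Lookup then becomes a plain dict access, e.g. `parse_codings(raw)["ytc_Ugx7cl44rk2dykDc5F14AaABAg"]["emotion"]`.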