Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
- "AI (or at least LLM) are interesting because people believe it's intelligent, th…" (ytc_UgzXaTwnq…)
- "Once I put in a braille dit ASCII of an Among Us guy and the ChatGPT ai thought …" (ytc_UgyAq2BM7…)
- "Already happening. My friend just got replaced by an AI. She was fired yesterday…" (ytc_UgzfWEUJT…)
- "Even if it was a so deemed talent, these people have absolutely no idea of how t…" (ytc_UgyVGBeTI…)
- "Come on...what an intellectually challenged & careless human...that answers why …" (ytc_Ugwcs3LPd…)
- "@LilloDolla999 100% this is the first time in history that 'high skill' and 'kno…" (ytr_UgwsOUHFt…)
- "No wonder the matrix and Terminator both movie series had a common main antagoni…" (ytc_UgzqukvIT…)
- "@PASTEL._.777XD the more complicated your design the harder it will be for AI t…" (ytr_UgxN4cO1_…)
Comment
Ankur, please research this more deeply. AI isn’t just another innovation like the Industrial or Computer Revolution — it’s fundamentally different. This could be the last major creation humans ever make, as we’re essentially building a creator. The few jobs that emerge to manage AI will only exist temporarily, until AI becomes capable of managing itself — remember, it’s intelligent. Within the next decade, there will be no jobs left that require human thinking — none at all.
Source: youtube · 2025-10-15T04:1… · ♥ 55
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyvVktenvTPVhig2fN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGInIhIh9rcykOsXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwliONC1ISazS-b8A54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztkQa_sULfGqilM2F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgztysOzZk1fj_pnPEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzCTc_X9Lm3xBXjGCN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6siuulwgFp7YDNLh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgytG2uJhjTkxB8LRRR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxW23LYd_y_rlmi1pZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrciEbkG0r5mQAdf14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]