Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Not exactly sure what you mean. But can you name a person alive right now who is…" (ytr_UgwqkUO2p…)
- "The self driving feature is there to assist with driving, the crash was do to th…" (ytc_Ugy3L76YY…)
- "Lol fun fact theyve used/accepted ai before. I remember it was like a summer pro…" (ytc_Ugyoi2eFQ…)
- "Each pixel is a plagiarism. Every pixel is meant to mimic something. I don’t fin…" (ytr_UgxReG7si…)
- "They're currently trying to teach children how to set their houses on fire. This…" (ytc_UgwxZRNU5…)
- "I've been saying and preaching this in this sub for a while now. I believe it's…" (rdc_m94be2g)
- "Do not buy products or services from companies that use AI to replace humans. T…" (ytc_UgyWH_Yj3…)
- "The way i see it,when AI takes jobs,people will finally be FREE to LIVE THEIR LI…" (ytc_Ugw6XVvyA…)
Comment
> I strongly suspect that current A.I. generated by classical computers, which are just long combinations of 1's and 0's are only a simulated conciosness, however complex. A truly self aware A I. I believe is only possible when quantum computers advance. They aren't limited to binary code.

Platform: youtube · Topic: AI Governance · Posted: 2023-07-08T03:5… · Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
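A coding result like the one above can be checked against the value sets that actually appear on this page. This is a minimal sketch: the `ALLOWED` sets below are inferred from the visible records, not an official codebook, so treat them as assumptions.

```python
# Allowed values per coding dimension, inferred from the records visible on
# this page (an assumption, not an authoritative schema).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "fear", "approval"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems found; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above passes:
print(validate_coding({
    "responsibility": "none",
    "reasoning": "unclear",
    "policy": "none",
    "emotion": "indifference",
}))  # → []
```

A check like this is useful as a guard before storing model output, since an LLM coder can occasionally emit a value outside the intended label set.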
Raw LLM Response
```json
[{"id":"ytc_UgwjNNgLoE2mABsaJTZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugx8rFmPLVXc1_pO3id4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzZO09PdAB80qE4TJd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugxpy84iCY1lvyvvtWl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx6fyOtpR-kBG8Hi1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxfrfLe7AEQ3rcBspN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzCzYgdJDjOj8yw4tF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzUWtNsXxh0GnybYzh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzS384EM8xchcs8N414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugwgt_xnnfOeHB0vQ414AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}]
```
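The raw response is a JSON array of coded records, so the "look up by comment ID" behavior described at the top of this page can be sketched by parsing the array and indexing it by `id`. The snippet below uses a two-record excerpt of the response above and assumes the model returned well-formed JSON; a real pipeline would need error handling for malformed output.

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgwjNNgLoE2mABsaJTZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx8rFmPLVXc1_pO3id4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# Index the batch by comment ID so one comment's coding is an O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

coding = by_id["ytc_UgwjNNgLoE2mABsaJTZ4AaABAg"]
print(coding["emotion"])  # → fear
```

Because each record carries its own `id`, the batch can be coded in one model call and still be joined back to individual comments afterwards.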