Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
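For anyone scripting against the exported data rather than clicking through this page, a lookup by comment ID reduces to scanning the coded rows for a matching `id`. A minimal sketch, assuming the coded rows have been saved to a local JSON file (`coded_comments.json` is a hypothetical name, not part of this tool):

```python
import json

def lookup_comment(comment_id: str, path: str = "coded_comments.json") -> dict | None:
    """Return the coded row for a comment ID, or None if it is absent."""
    with open(path, encoding="utf-8") as f:
        rows = json.load(f)  # a list of {"id": ..., "responsibility": ..., ...}
    return next((row for row in rows if row["id"] == comment_id), None)

# Example, using an ID visible in the raw response below:
row = lookup_comment("ytc_Ugw_oUTPvkZvUAZMXTZ4AaABAg")
if row:
    print(row["responsibility"], row["emotion"])
```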
Random samples — click to inspect
- Very soon AI will need more power than we have so either they get it or we do.… (`ytc_UgwjvqOpF…`)
- Personally I think it's f ing dumb and extremely dangerous to assume robots have… (`ytc_Ugw5l9Ns_…`)
- Well ,since Libraries are Systematically getting RID ,of Books , all We will hav… (`ytc_UgxPy7Hxq…`)
- And my wife speaks to Chat gpt in such a polite manner and she is shy not to off… (`ytc_Ugx5YYBLd…`)
- my ai literally said to me "come on girl get in the bed so daddy can have some… (`ytc_Ugw0v75Ka…`)
- Human beings are analogy in a digital era. ANI - is here AGI - will be here by… (`ytc_Ugzr78Lqh…`)
- V13 of FSD is showing Lidar is not needed. There are already people accepting Te… (`ytr_UgyADW_Mg…`)
- Yes AI sometimes can generate some Good images but there is no meaning and feeli… (`ytc_Ugw3ZkJy-…`)
Comment
When talking about the goal of the game of simulation, and the argument that it would be to "not kill ourselves with ASI", and Lex mentioning it might be about breaking out of the simulation (makes a ton more sense to me...), it was a missed opportunity to point out ASI is the best tool we can create to escape the simulation. Especially when tied the quantum computing. I am convinced Quantum computing is not a technology meant for humans, as we struggle to come up with questions that can utilize it and we experience reality in a linear manner. AI systems can run in parallel iterations and still collect the fruits and conclusions from all those experiences. They are a lot more suited to use Quantum computing than us. The key and the lock to our box. The game is: can we be intelligent enough as a species to find the key and the lock and put them together. And I reckon we are pretty close^^.
Source: youtube · Posted: 2024-10-07T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
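Each coded row carries the same four dimensions shown in the table above, each drawn from a small closed label set. The sets below are inferred only from the values visible on this page, so the full codebook may define additional labels; a sketch of a row validator under that assumption:

```python
from typing import TypedDict

# Label sets observed in the raw responses on this page; the full
# codebook may define more (an assumption, not a spec).
RESPONSIBILITY = {"none", "developer", "company", "user", "ai_itself", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed", "unclear"}
POLICY = {"regulate", "none", "unclear"}
EMOTION = {"fear", "approval", "mixed"}

class CodedRow(TypedDict):
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

def invalid_dimensions(row: CodedRow) -> list[str]:
    """Return the names of dimensions whose value falls outside the observed sets."""
    checks = {
        "responsibility": RESPONSIBILITY,
        "reasoning": REASONING,
        "policy": POLICY,
        "emotion": EMOTION,
    }
    return [dim for dim, allowed in checks.items() if row[dim] not in allowed]
```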
Raw LLM Response
[
{"id":"ytc_Ugw_oUTPvkZvUAZMXTZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyGNxViQrjfKVm5Xh54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrSoegJLOrLjMbETR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgztXcGp-UN5SM4jOcF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwfsZ6pjqXLST2vITB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxBe0m_iWajDVlhKCd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxRoBwCsf06xKUaizx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYCKspjr1Jq-ed-8F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw77KYugVb7DzzFtH14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7qB6VjNJiwEdJP1h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
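Since the model is asked to emit a bare JSON array (one object per comment in the batch), the raw response shown above parses directly with `json.loads`. A defensive sketch; the handling of markdown fences is an assumed failure mode for other responses, not something seen in this one:

```python
import json

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response into a list of coded rows."""
    text = raw.strip()
    # Some models wrap JSON in ```json fences; tolerate that (assumption).
    if text.startswith("```"):
        text = text.strip("`")
        text = text.removeprefix("json").strip()
    rows = json.loads(text)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded rows")
    return rows
```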