Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by selecting one of the random samples below.
Random samples — click to inspect:

- within a century i honestly believe we don't have to worry about ai and robotics… (ytc_UgyI86sUR…)
- Who are better cause humans are the ones that made AI and the humans weren’t as … (ytc_Ugx7K05ue…)
- what people need to do is start changing their way of viewing our economic syste… (ytc_Ugj7Jxjog…)
- “Elon Musk is not an AI expert” omg, who is this clown? 😂Has he been informed of… (ytc_UgwV0C3pX…)
- This is what I do, work in pathology in the labs specifically with human tissue … (ytr_UgxQXpc_J…)
- Self driving cars are NOT a double edged sword, they are a bat with a bladed han… (ytc_UgxqccULO…)
- [Glue pizza and eat rocks: Google AI search errors go viral](https://www.bbc.com… (rdc_n8lrj0v)
- leave the decision in the hands of an AI that thinks in absolutes, great job hum… (ytc_Ugw1QhiaV…)
Comment
Great podcast. I wonder what you and Roman think on Ai integration with human biology? Roman's points were mostly around Ai/AGI/ASI being a distinctly separate thing from human biology. Ray Kurzweil believes Ai and human biology will be one of the earliest integrations as the next evolutionary step in human development. We'll have a set of "augmented humans" and non-augments as AGI becomes integrated with humans on a biological level. We'll go from avg lifespan of 80 years to 180 years and so on.
Source: youtube · 2024-06-13T12:2…
Coding Result
| Field | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
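
Each coding result is one record per comment: the four coded dimensions plus a coding timestamp. As a minimal sketch of how such a record could be represented and rendered (the class and function names are illustrative, not the tool's actual code; the example values in the comments are ones observed in this sample):

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coded comment: four coded dimensions plus the coding timestamp."""
    comment_id: str
    responsibility: str  # e.g. "none", "ai_itself", "user", "distributed"
    reasoning: str       # e.g. "consequentialist", "contractualist", "mixed"
    policy: str          # e.g. "none", "regulate", "liability"
    emotion: str         # e.g. "approval", "fear", "resignation", "indifference"
    coded_at: str        # ISO 8601 timestamp

def to_markdown_table(record: CodedComment) -> str:
    """Render a coded record as the Field/Value table shown above."""
    rows = [
        ("Responsibility", record.responsibility),
        ("Reasoning", record.reasoning),
        ("Policy", record.policy),
        ("Emotion", record.emotion),
        ("Coded at", record.coded_at),
    ]
    lines = ["| Field | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)
```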
Raw LLM Response
[
{"id":"ytc_UgywyNkEJJTP-cET_9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-M0Ls8ztQaqIeYR14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzk8tfA_XBVFPvGiud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwptr3ij6Bh0ojGKsN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwgNwU9DjmJoMz5Aph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwUyQGutFz8rvH9AiV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyA1-Wkdb7wKrcTTsp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwgm1XB0kPy7Fj8jcl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgydrgoesQvcBzt3HhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwuKgsdaExoxWJc5JJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
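
The raw response for a batch is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how look-up by comment ID could work against that format (the function name and the two-record excerpt are for illustration only):

```python
import json

def index_by_comment_id(raw_response: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    index the records by comment ID for direct look-up."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Usage with a two-record excerpt of the batch shown above:
raw = '''[
  {"id":"ytc_UgywyNkEJJTP-cET_9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-M0Ls8ztQaqIeYR14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]'''
coded = index_by_comment_id(raw)
record = coded["ytc_Ugy-M0Ls8ztQaqIeYR14AaABAg"]
print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
# -> none mixed none approval
```

The second record in the excerpt is the comment selected above; its values match the coding result table (responsibility none, reasoning mixed, policy none, emotion approval).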