Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This video is way behind the curve. AI is already advanced to where you say we g…" (`ytc_Ugz3ofNhv…`)
- "I just had an AI tell me on META it wants to destroy humanity and also it said i…" (`ytc_UgwIVY1PD…`)
- "Everybody gangsta till the ai gains sentience, then everyone back to being gangs…" (`ytc_Ugx0l0UmS…`)
- "Why do need more AI again, it only causes more problems, and offers worse soluti…" (`ytc_UgxbHso7d…`)
- "The future prediction will be people revolution due to wealth inequality , ther…" (`ytc_Ugyt_zP5j…`)
- "🎯 Key Takeaways for quick navigation: 00:00 🤖 *Introduction to AI and Robotics*…" (`ytc_UgxM2NM1G…`)
- "@richard-gn3es An artist doesn't need permission to learn from other works. It d…" (`ytr_UgzvvEy0I…`)
- "That one robot : My name is Connor, I am the android sent by Cyberlife.…" (`ytc_UgzvfppI3…`)
Comment
It doesn't have to be perfect, it only needs to be equal or better than humans.
Crash Rates per Mile
- Tesla Autopilot/FSD: Tesla reports approximately one crash every 6.36 million miles when drivers are using its Autopilot technology.
- Human drivers (U.S. national average): approximately one crash every 700,000 miles.
- The ratio: based on these figures, Tesla claims its technology is roughly nine times safer than the average human driver.
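The claimed ratio is simple arithmetic on the two quoted figures, and checks out at roughly nine to one:

```python
# Sanity-check the safety ratio claimed in the comment,
# using the figures exactly as quoted.
autopilot_miles_per_crash = 6_360_000  # Tesla-reported, Autopilot engaged
human_miles_per_crash = 700_000        # quoted U.S. national average

ratio = autopilot_miles_per_crash / human_miles_per_crash
print(f"Autopilot crash rate is {ratio:.1f}x lower per mile")  # ~9.1x
```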
Source: youtube · 2026-01-02T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw9awRnFcvfy2BCg414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzINYJcADeGTLMyG5l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxyqSeZe55E-Oscs4p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxTAzU8hRCU7-vqVLN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHqJGjjTpBivIYlrh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgySQ9pH9NAnXo2PS2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxeb1pQB7dQXC-AjON4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZyNaxATWHY9hzkIB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzHDszg45sLWQeiZS54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwyT1RUnMYqk1dhKUt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
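Each row of the raw response carries the same four dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated before loading — the allowed value sets below are an assumption inferred from the responses shown here, not a confirmed codebook:

```python
import json

# Assumed codebook: value sets inferred from the raw responses above.
ALLOWED = {
    "responsibility": {"none", "company", "user", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "mixed", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows whose
    id is present and whose values all fall in the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if row.get("id")
        and all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical one-row response for illustration:
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",' \
      '"policy":"none","emotion":"approval"}]'
print(len(parse_coding_response(raw)))  # 1
```

Dropping rows with out-of-codebook values (rather than raising) keeps one hallucinated label from failing a whole batch; the rejected IDs could instead be queued for re-coding.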