Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugx9z5YYW…: "Even if superintelligence doesn't happen, it's already bad enough. We are turbof…"
- ytr_UgxXXiJ8N…: "We're glad you enjoyed the video! If you're intrigued by the future of AI and it…"
- ytc_UgwfkaQWh…: "A.I is made in love and gratefulness of being. your thought makes the quantum di…"
- ytc_UgyRP0cjz…: "The professor got the Nobel Prize in physics, but a few days later AI will say that …"
- ytr_UgxH38vUS…: "Probably, control to energy consumption and atomic technologies in many ways... …"
- ytr_UgzqAFwsA…: "Sigh. It isn't though. I'm an ML engineer and AI is already starting to automate…"
- ytr_UgwHiTvBD…: "@gamingphilosopher153 I agree with you on the character to person inference. I t…"
- ytr_Ugzjs8Elz…: "@JonAbrams-xt4tq 'Yes, but even so, the vehicle should break, slow down, swerve,…'"
Comment
Interesting point that we will learn to adapt as we have in the past. Before the Industrial Revolution people worked 60-80-90 hour work weeks. But with technology and automation we set out laws restricting labor hours and conditions. So with AI do we just work 20 hr weeks? I’m most looking forward to a cook and housekeeper😂
Source: youtube
Topic: AI Governance
Posted: 2025-09-21T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxonJ0o8-dbrtdkdsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyRsXtfpNqOoyr9oSR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwpmRkDXCb0j1eP_mp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzWHGStW4wN0y5a2d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFULFn-tSVAjMqFLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxWlredTPBv8X72vOx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyHh_qq7azkqmENeUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFWy-oGH7XMpvXCAt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx6EpyFd3iM5p3bj4V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyJLodASMI7R06sFcx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
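The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and validated before it reaches the table above; `SCHEMA` and `parse_batch` are hypothetical names, and the allowed values per dimension are inferred from the samples shown here (the full codebook may contain more categories):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response, keeping only rows whose values fit the schema
    and whose id looks like a comment (ytc_) or reply (ytr_) id."""
    valid = []
    for row in json.loads(raw):
        in_schema = all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
        if in_schema and row.get("id", "").startswith(("ytc_", "ytr_")):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgxonJ0o8-dbrtdkdsh4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"},'
       '{"id":"bad_row","responsibility":"aliens","reasoning":"mixed",'
       '"policy":"ban","emotion":"fear"}]')
rows = parse_batch(raw)          # the malformed second row is dropped
tally = Counter(r["policy"] for r in rows)
print(tally)  # Counter({'regulate': 1})
```

Dropping out-of-schema rows (rather than raising) keeps one hallucinated label from aborting a whole batch; rejected ids could also be logged for manual re-coding.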