Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It's an inevitable step in development for Ai usage, how naive and delusional ca…
ytc_Ugw9eqmj4…
I’ve seen an assload of AI videos by smart people at this point and this is my f…
ytc_UgwwpmmVD…
they need to be turned off and baned by the federal government you dont need ai …
ytc_UgyqSJ7QJ…
I feel like I would be forced to do more cognitive work for companies to train …
ytc_UgxNd-oMc…
AI may not be intelligent/conscious but it's damned accurate and fast. It can do…
ytc_UgzYkbVVv…
Been training AI for several years now and @10:28 is a bold-faced lie. The amoun…
ytc_UgxvEDLhZ…
Here are a few working theories people kick around:
Universal Basic Income (UBI…
ytc_Ugx5z67-W…
This is setting a precedent in AI: For every "Good Actor" leaving the field it i…
ytc_UgymFB3xG…
Comment
Does not matter, someone will continue, so everyone will continue. We will not be able to control ASI, and at some point it will get to that stage, whether it's now or in 100 years.
100% will be automated at some point, at least once automation starts to produce faster than we can multiply. Yes, we can have filler jobs just to make sure the ASI isn't doing anything, but if it were doing something we would not understand it, so it would just be us thinking we have control.
Will it be chaos while we move towards it? Probably, because humans won't evolve just because the AI evolves.
We might even be the ones killing ourselves before ASI can save us from ourselves.
For fun, imagine this scenario: "China develops ASI first, and to fully benefit from it you need to install a chip connected to your neural network for access to improvements." How many people in the USA would install that chip? How many would rather start a war to capture the ASI from China? War is the only path humanity seems able to take, so if the ASI isn't fast enough to save us, we will probably destroy the entire planet just to stop the chance of someone else gaining control of it.
youtube
2025-09-17T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
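Each coded record carries the four dimensions shown in the table. A minimal validation sketch, where the allowed value sets are inferred only from the responses visible on this page (the actual codebook may define more values):

```python
# Dimension value sets observed in the coded responses on this page.
# Assumption: the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "distributed", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"resignation", "outrage", "approval", "fear", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The record coded above passes; a typo in a dimension value would not.
print(validate({"responsibility": "none", "reasoning": "consequentialist",
                "policy": "none", "emotion": "resignation"}))  # []
```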
Raw LLM Response
[
{"id":"ytc_UgzeZGw3aZIW3df-TCZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxLyGofstFxbgnITC54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwupyS4cENQRuelU0x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSzrc0AjRslU7yafF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzHRrSpMiDsiNxKyRl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxUp_W4OaUHQ1DX9Dh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwnStDeWPK-BYyvlZB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz5paIl3sWPgMtD0St4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx4wX4dyZfAbUcIL5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJWFt2zAJTJsu5Vid4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
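The raw response is a JSON array with one record per comment. A minimal sketch of the "look up by comment ID" step, assuming the model's response text is available as a string (the two records here are copied from the array above):

```python
import json

# Assumption: raw_response holds the model's JSON output as text;
# abbreviated here to two of the records shown above.
raw_response = """[
  {"id": "ytc_UgzeZGw3aZIW3df-TCZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxLyGofstFxbgnITC54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and index each record by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_codes(raw_response)
record = codes["ytc_UgzeZGw3aZIW3df-TCZ4AaABAg"]
print(record["emotion"])  # resignation
```

Indexing by ID turns the linear array into an O(1) lookup, which matches how the page resolves a pasted comment ID to its coding.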