Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> You can't have self driving cars along with human drivers. People do soo many things that are considered "illegal" but makes sense to others and are rarely enforced. An example if a car is blocking your path you can pass a double solid line and re-enter your line but would have to make sure that traffic isn't coming in, from my understanding self driving cars would just stop moving.
>
> Hence the only way driving is safe and efficient for everyone is when every car on the road is self driving or is not. You shouldn't combine the two together as it might make for some bad results.
>
> Im hopeful we'll get to a solution of fully self driving cars in the near future

Source: youtube · Posted: 2024-01-09T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
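Each coded comment gets one value per dimension. A minimal sketch of how such a row could be checked against the codebook — note the allowed value sets below are inferred from the sample responses on this page, not taken from the full codebook, so treat them as an assumption:

```python
# Allowed values per dimension, inferred from the sample responses shown
# on this page (the real codebook may define additional categories).
CODEBOOK = {
    "responsibility": {"developer", "user", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def validate(coding: dict) -> list:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if coding.get(dim) not in allowed]

# The row from the table above:
row = {"responsibility": "ai_itself", "reasoning": "deontological",
       "policy": "unclear", "emotion": "resignation"}
print(validate(row))  # → []
```

A missing or unknown value (e.g. a hallucinated category in the model output) shows up in the returned list, which makes it easy to flag rows for manual review.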
Raw LLM Response
```json
[
{"id":"ytc_UgxlDauPln9US6EIuPt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw8kPSMHARD2HOqVQx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz6qNzsGNahxUUVU0B4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzH-mPq63xDRrjDfol4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyRUb5KCizEJ1Bs43F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzX94dOZIwiEyySP6p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxEGOtPXLipjXDA70F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwg4ZpyNbyPhjFhRx94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2VwkTSMvkRCsZ2OR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz5dqAPvaPbYc3bItV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
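The raw response is a JSON array covering the whole batch, so a single comment's coding has to be looked up by ID. A minimal sketch of that lookup, using a two-row excerpt of the response above (the parsing must tolerate non-JSON output, since the model is not guaranteed to return valid JSON):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = '''[
{"id":"ytc_UgxEGOtPXLipjXDA70F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw8kPSMHARD2HOqVQx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw model response and return the coding dict for one
    comment ID, or None if the output is invalid or the ID is absent."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None  # the model did not return valid JSON
    return next((row for row in rows if row.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgxEGOtPXLipjXDA70F4AaABAg")
print(coding["responsibility"], coding["emotion"])  # → ai_itself resignation
```

This is how the "Coding Result" table above can be reconstructed from the batch response: the row whose `id` matches the inspected comment supplies the four dimension values.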