• 0 Posts
  • 14 Comments
Joined 1 year ago
Cake day: July 22nd, 2023



  • I was a nuclear operator in the Navy. Here are the actual reasons:

    1. The designs are classified US military assets
    2. They are not refuelable
    3. They only come in 2 “sizes”: aircraft carrier and submarine
    4. They are not scalable. You can’t just make a reactor 2x as big
    5. They require as much down time as up time
    6. They are outdated
    7. The military won’t let you interrupt their supply chain to make civilian reactors
    8. New designs over-promise and under-deliver
    9. They are optimized for erratic operations (combat) not steady state (normal power loads)
    10. They are engineered assuming they have infinite sea water available for everything

    There’s more but that’s just off the top of my head





  • But we do, though. Maybe not by testing every possible scenario exactly, but typically when we make a design decision we plan for the worst theoretical condition the part will be exposed to. Then we design for 5-10 times that. Think about the cost and effort added to everything with that level of scrutiny. We design for fringe cases. That’s the point I’m trying to make. It’s insane to me that because it’s software, companies get a free pass on that level of scrutiny. As software takes over more car functions, that becomes more concerning. It’s bullshit that I’m part of their beta test.
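
    The sizing rule described above (worst theoretical load, then 5-10x on top) can be sketched in a few lines. This is a toy illustration only; the function name and the 2 kN / 5x numbers are made up, not from the comment.

    ```python
    # Toy illustration of the safety-factor approach: size the part for the
    # worst theoretical load it will ever see, then multiply by a safety
    # factor (5-10x in the comment above). All numbers are made up.

    def required_capacity(worst_case_load: float, safety_factor: float) -> float:
        """Capacity the part must be rated for: worst case times the safety factor."""
        if safety_factor < 1.0:
            raise ValueError("safety factor must be >= 1")
        return worst_case_load * safety_factor

    # A part that will never see more than 2 kN in service, designed with a
    # 5x factor, must be rated for 10 kN.
    print(required_capacity(2.0, 5.0))  # 10.0
    ```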



  • The thing that kills me about this sort of thing is the complete lack of accountability. Working class people at assembly plants, dealers, and suppliers will all feel the sting from the drop in sales. There’s some dipshit MBA at GM who made and pushed this decision. Any rational person could see that GM is not in a position to push their own infotainment system. CarPlay and Android Auto are beloved. Not having one or both is a deal breaker for new car purchasers. We will never know who this person is. Making such an outrageously bad business decision should result in this person being blackballed from any kind of business role. But that will never happen.






  • I completely disagree. It absolutely is AI doing this. The point the article is trying to make is that the data used to train the AI is full of exclusionary hiring practices. AI learns this and carries it forward.

    Using your metaphor, it would be like training AI on hundreds of Excel spreadsheets that were sorted by race. The AI learns this and starts doing it too.
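
    The metaphor can be made concrete with a toy sketch (all data and names here are hypothetical): even a trivial “model” that just learns per-group hire rates from biased historical decisions will reproduce exactly the same skew in its recommendations.

    ```python
    # Minimal sketch of bias carried forward from training data. The "model"
    # is nothing but per-group hire rates learned from history, which is
    # enough to show the effect. Data is entirely made up.
    from collections import defaultdict

    # Historical decisions: (group, hired). Group B was rarely hired, for
    # reasons unrelated to qualifications.
    history = [("A", True)] * 80 + [("A", False)] * 20 + \
              [("B", True)] * 20 + [("B", False)] * 80

    def fit_hire_rates(records):
        counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
        for group, hired in records:
            counts[group][0] += int(hired)
            counts[group][1] += 1
        return {g: hired / total for g, (hired, total) in counts.items()}

    rates = fit_hire_rates(history)
    # The "trained" model now favors group A candidates 4x as often, purely
    # because the training data did.
    print(rates)  # {'A': 0.8, 'B': 0.2}
    ```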

    This touches on one of the huge ethical questions with regulating AI. If you are discriminated against in a job hunt by an AI, whose fault is that? The AI is just doing what it’s taught. The company is just doing what the AI said. The AI developers are just giving it previous hiring data. If the previous hiring data is racist or sexist or whatever, you can’t retroactively correct that. This is exactly why we need to regulate AI itself, not just its deployment.