
The incompetence is mind-blowing. Could this be a good argument for requiring software engineers to get a professional license?


It seems abundantly clear that Equifax's incompetence is _systemic_. Under the presumption that they could have hired better engineers, I fully believe they would have managed them into submission.


The kind of licensing proposed here would (or should) carry significant negative consequences for malpractice, possibly including revocation or suspension of the license (and therefore prohibition from working on projects requiring licensed engineers) and even civil or criminal penalties. It also carries credibility and protection: a licensed engineer has a duty to report an employer's attempts to circumvent the rules (as Equifax hypothetically might have done here), and legal protection for their livelihood when they do so.

It may not prevent truly unscrupulous or spineless engineers from capitulating, but it's better than the current situation.


Or, you know, punish the managers for once instead of the foot soldiers...

When Wells Fargo had their credit scandal, the salesmen shouldn't have been punished; their managers should've been.

These things start at the top. When deadlines are pushed onto you, you don't have time to write unit tests, refactor, or update dependencies.


Licensing empowers the engineer to refuse to do something that violates sound engineering practice according to the license and have legal recourse against retaliation.

It isn't perfect, and the imbalance of power will certainly still be an issue. But that doesn't mean we shouldn't try.


> Licensing empowers the engineer to refuse to do something that violates sound engineering practice according to the license and have legal recourse against retaliation.

It would just put most of the legal liability on engineers instead of the org. It's a great way to protect management; that's the only thing it would do. That's exactly how junior traders end up being scapegoated in each financial scandal. Any engineer who dared report any wrongdoing would be blacklisted for life from the IT industry.

Businesses like Equifax already have legal requirements at the org level; let's not shift all responsibility onto engineers.


I'd be amazed if the average/combined skill level of engineers at any large company exceeds the average/combined skill level of the people trying to compromise its security.

And that's not taking into account the bureaucratic overhead required to make changes in such an environment. There are very good, and very bad, reasons why upgrading insecure software and fixing other security holes takes so much time and effort.

Equifax just happens to be a very attractive target. I don't know how any such target can stay truly safe.

(Having said that, they clearly screwed the pooch in a lot of ways, so I won't shed a tear if they're dismantled.)


Libraries, frameworks, and other security systems don't have to be developed in-house. It's just like basic data structures and algorithms: few should be rolling their own; they should be using libraries instead.


All of those are insecure, so it's still a matter of staying ahead of attackers. And avoiding social engineering. And making certain the code that glues those libraries and frameworks together is secure. And making sure people don't accidentally leave an S3 bucket unsecured. And making sure every 3rd party contractor on-site doesn't take advantage of softer internal security. And making sure employees aren't bribed by competitors.

And making sure the business can still function while doing your best to limit functionality.


Yes and no; it was Apache code that was exploited. The failure, though, wasn't really technical; it was the lack of urgency in patching once the flaw was known, which is 100% on management.


Or managed into finding another job, most likely.


This particular site looks like it might not have been touched for a decade or more.

So there is also the argument of "any engineers at all" vs "better engineers".


A professional license for what? CRUD apps? A CS education doesn't even make one a web security specialist. What's next? Forcing corporations to use Microsoft technologies to stifle competition and innovation? Like big vendors never release insecure products? Like big consultancies never develop insecure apps?


No, because then you get things like this happening:

https://www.clickondetroit.com/news/fake-architect-sentenced...

Also, consider how much of the software you use on a regular basis would not exist if mandatory licensing were in place.


I'm not sure requiring a license to practice software development is a good idea, but it does seem that we could use some rules around development and maintenance of important applications. Perhaps legally mandated security audits for anyone storing things like financial data would be useful.

