
Arduino as a programming language?




I do a daily writing exercise: write at least 1000 words, whatever comes to mind. No editing. Just let it flow and see where it takes you. Once you get used to it (which happens very quickly), you will be surprised how much it improves your writing skills.


You absolutely do not have the best behind you! My big "turnaround" came at the beginning of my 50s. I finally got to a stage in my working life that I like, and I also met the woman of my life (after many years of being single) and got married. I still feel that I haven't had the best yet when it comes to work, and I plan to continue working for at least 20+ years, with no plans of retirement. Today it makes no sense at all to think about age; just keep up your motivation and drive, stay healthy, and you can go on "forever". Find out what you really want to do and do it. There are no limits here in life.


"Also, Ada’s strictness and correctness may be perceived as an anti-feature. When talking about general-purpose programming, businesses need fast development cycles and short delivery times. For the companies, to be successful, it’s far more important to be ahead of the competition even with an incomplete product than to let the customers wait and eventually abandon the project."

That is not necessarily true when it comes to aerospace and defense projects. In these cases it is not important to be ahead or to have short delivery times, so Ada is still suitable there.

That a language is a general-purpose programming language does not mean that it is suitable for all application domains. You always need to use your common sense and experience to choose the right tool for the task.


> That is not necessarily true when it comes to aerospace and defense projects

They chose to use C++ for the F-35, which is buggy as hell, because they thought there were no Ada programmers available. But if you are planning a multi-decade project, wouldn't you just... train some yourself?


This points to a general problem (in the US, at least): a complete failure of any sense of on-the-job training or internal advancement.

Companies complain when they can't find purple squirrels, even when they could hire someone with great general skills in the profession and train them in about the same amount of time it takes to onboard someone with precisely the skills they want.


It's not just a matter of training.

Let's say you get a job offer from a company with the worst case of NIH syndrome you've ever seen, and all their code is written in a programming language that the company created, and the only code in the world written in this language is at this company. Even if the company agreed to train you, would you take the job offer?

Chances are, no, because if you were to ever switch jobs in the future, you want to still be employable, and not have invested years and years into a stack that's completely irrelevant to the rest of the world.

So if the company is of such a literally enormous size and permanent standing to be able to make you a believable offer of lifetime employment / tenure for learning their untransferable skills, like a military or other government body, then maybe. For a defense contractor which could go belly up after some upper-management scandal? No way. For anybody else in the private sector? No way.

Either way, it's still problematic, because your hiring mechanics now revolve around unfirable employees, and the easiest way to demotivate employees is to put them in an environment where mediocrity is accepted because the perpetrators of the mediocrity can't be fired.

Ada's problem is that it was only a defense-sector language, for reasons specified by others in the thread, e.g. compiler cost. Ultimately, targeting tooling at only a specific industry dooms it to failure; the tooling must stay relevant to many companies and industries, both so that people will see it as a career-advancing skill and so that the labor market remains fluid.


> Even if the company agreed to train you, would you take the job offer? Chances are, no, because if you were to ever switch jobs in the future, you want to still be employable

A counter-example would be Goldman Sachs, with their own language (Slang) and their own database (SecDB); former Goldmanites have no trouble finding work afterwards.


An interesting point which I will concede. But I doubt the exception disproves the rule for the general case.

For what it's worth, GS was founded in 1869, which makes it older than most of the behemoth systems companies like IBM (1911) or AT&T (1880). Big banks like GS may not be as invincible post-2008 as they used to appear, but if there are institutions of which you could say "this institution is not going to evaporate in the next few decades that will make up my career", GS would not be a bad bet.

Which brings up an interesting question: how many former Goldmanites a) worked primarily in Slang (given that not all Goldmanite engineers work on SecDB), b) were laid off or otherwise left GS without having a job lined up, and c) then found work? If you can prove that you can work on a different stack and get a job with no frictional unemployment, that is a much different picture compared to engineers who stay out of work (the parallel question being: why is Slang so much more de facto employable than COBOL or other languages which, for one reason or another, are now unpopular or unused in the wider industry?).


How do you ensure they stick around? Training has a high cost that isn't worth it if people jump ship 2 or 3 years later, as is the norm in this industry.


Well, it's a reason as good as any other to start paying good salaries AND good raises, instead of just the former. And don't skimp on the benefits.

As a general rule, treat your employees fairly (which includes updating their salaries up to market value, as often as needed) and, lo and behold, they will stay. There is a lot of talk about changing jobs because of a lack of challenge, wanting to try new technologies, etc., but 99% of the time, it's about money (and benefits).

Admittedly, this idea might not be very well received by most companies.


I dunno, nobody I know is actually looking to job hop. If you treat people decently, give them raises, train them, why would they have any incentive to leave?

Hiring somebody new costs 15-25% of the raw salary cost for the year. Pay people 10% more instead, no incentive to leave.


Plus however long it takes to ramp them up, could be 6 months on a reasonably complex codebase. And the time of other engineers both in the hiring process and bringing the new engineer up to speed. I’d say the cost of replacing an engineer could be as much as their entire first year’s comp.


Perhaps if you invest in people, they wouldn't do that? I mean, people job hop because that is the only way to advance, but what if there were another way? Job hunting, interviewing, etc. are a royal pain in the backside. If there were a way to have a career without all that, everyone would be happier, employers included.


People jumping ship 2 or 3 years later is a result of the market correcting itself. There was a point in time when people stayed for 20 years in a company. Why? You could switch, but the pay would be the same.


There is a cultural factor too, many companies seem to have an aversion to promoting from within. It's usually far easier to get promoted by interviewing elsewhere for a higher position than it is to deal with the internal politics of moving up within the same org. But it could be a clear expectation: complete this training, get this experience, move up a rung.


Then your employees aren't commoditized, replaceable sources of programming work who you can hire and fire at the whim of the budget cycle.


Defense is all done by contractors or lifers, far away from the competitive tech hubs. I don't think they have a retention problem, unlike companies in the Valley.

Also, the alternative is C++ and C++ developers are not a replaceable commodity either.


There are plenty of defense contractors near tech hubs. Defense was a big business in the SF Bay Area long before there were software companies there, and it's still big today:

https://en.wikipedia.org/wiki/List_of_companies_based_in_the...

In the NY metro area, there are lots of defense contractors on Long Island.


Yes we are, mostly. There are very few programming jobs for which the laborers aren't essentially replaceable commodities, including programmers who are proficient with C++.


My tiny aerospace knowledge also tells me that there's a ton of C++ tooling that is already known to shops, and they rely on this rather than switching culture to Ada, even if it may provide greater value in the mid to long term.


> My tiny aerospace knowledge also tells me that there's a ton of C++ tooling that is already known

All that tooling isn't helping produce the software though, is it?

https://www.theregister.co.uk/2017/01/12/f35_alis_software_d...

And

https://www.theregister.co.uk/2018/03/22/f_35b_block_4_upgra...

for just two examples.


I never implied it was useful, just that people are more likely to take C++ and coat it with tools to feel like they're getting things done by juggling masses of files.


> which is buggy as hell

Is this really the fault of C++? Ada was appreciated for providing static code analysis out of the box, but they found that coupling C++ with a (modern) static analyzer gave better results.


> > which is buggy as hell

> Is this really the fault of C++?

I'm not saying it necessarily is, but it clearly can be. When you fuck up language semantics, as with pointers in C and C++, making things vague and unspecifiable with regard to important properties (aliasing, memory allocation, etc.), you lose something that is extremely hard to get back via static analysis.

For this reason, a language that has safe properties in a domain (like Rust or Ada) will always be safer in that domain, by construction, than a language where you graft static analysis on top.

This can be somewhat alleviated by coding rules/standards and annotations for static analyzers, but then you're not programming in the same language anymore, which has its own set of problems.


Well, the F-22 had control software written in Ada and it doesn't have as many reported software problems as the F-35.


The F-22 is also a decade older.


I think a comparison over the same timeframe would still show the F-22 had fewer software problems. The F-22 also has some seriously complicated software when you look at its capability.


It's a fair assessment. There are almost no Ada programmers available, and there will be fewer and fewer in the future.

It's a difficult language, on par with C++. It's tough to learn, even if you already have a decade of C++.

I think it's fair to pick Ada for that kind of project but it is also fair to avoid it. Either decision is justified.


In the past, Ada was relatively easy to learn for anyone with Object Pascal or Modula-2/3 knowledge.

Which used to be quite common before C and C++ became widespread during the mid-90s.

Ada's biggest problem was the price of the compilers and the lack of adoption by OS SDKs, an issue that also pushed other languages outside of the mainstream.

Businesses already had to pay for the OS developer tools; unless there were legal requirements in their domain, they weren't keen on buying extra compilers.


I can confirm that teaching newbies enough basic Ada to "go and correct logic bug X" takes 2-3 weeks of on-the-job training. We don't even look for Ada experience when hiring anymore, just programming experience and a low cowboy factor...


Agreed on that. Ada was niche from the start because the tools were expensive and hard to procure.

Then Windows and Linux became the dominant platforms and they are C only. Everything else died.


More like Windows and UNIX, not only Linux, but yeah that's the spirit.


> UNIX

What UNIX? The only one I can think of is macOS. What other UNIX is a dominant platform? Linux is not a UNIX, but rather UNIX-like.

I'm all for UNIX, love the design, love the history, but it is not dominant. To be honest, UNIX seems dead and only lives on in *nix-like OSes.


Within the context of this discussion, what are the distinctions between Unix and Linux that are important to you?


That Unix is the roadster that gave us C. But Linux was the one that drove C into our core operating system infrastructure. Windows did the same favor for C++.

One of the reasons why other languages failed? While C/C++ flourished in the OSS environment, even with its tooling being sub-par at the time. IMO.


I almost agree, just not quite regarding how it went.

C++ was already being adopted by all desktop systems (Mac, OS/2, Windows), even if at the lower levels it was a mix of Pascal (Mac) and C.

On proprietary UNIX systems, C++'s major stronghold was CORBA-based applications, while some companies were pushing C++ bindings for Motif.

Then GNU and Linux happened, with the FSF suggesting that the best approach for writing portable open source was C, and then we arrived at the present situation.


There are more computers on this planet than just plain desktops.

Most of them are servers, mainframes, and embedded devices running some form of UNIX-based OS.

If you prefer, I can rename it to *nix, as I wasn't thinking about POSIX certification.

BSD, Minix (running on your Intel chip), AIX, HP-UX, the POSIX layers in IBM and Unisys mainframes, RTOSes, NuttX, QNX, and many other POSIX-based OSes for embedded deployment.


> If you prefer, I can rename it to *nix

This makes me sound pedantic, but yes. I understand the world is more than plain desktops. The topic was dominant platforms; UNIX is not one.

I'm aware of most of your examples. But all of them are either UNIX-like or POSIX-compliant.

But as admitted, my complaint was a nitpick.

"UNIX is dead; long live *nix." ~ oneweekwonder


> It's a difficult language, on par with C++.

Ada is much simpler than C++; the hard part about learning it is the lack of tutorials and general information. That has changed in the last few years, but for the longest time the Ada community would just point to the (freely available!) standards doc to answer all questions.

The spec isn't that useful if one doesn't already know the concepts being explained, which makes learning the ideas behind Ada harder than it should be, but fundamentally those ideas are not hard.


Everything that exists in C++ also exists in Ada under a different name and syntax.

The separation of .cpp and header files, classes, inheritance, different kinds of inheritance, virtual functions, pointers, references, pointers to pointers, exceptions, meaningless errors from the compiler, etc...
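
For example, the header/implementation split corresponds to Ada's package specs (.ads) and bodies (.adb). A minimal sketch, with the package name and contents invented for illustration:

    -- counter.ads: the package spec plays the role of a C++ header
    package Counter is
       procedure Increment;
       function Value return Natural;
    end Counter;

    -- counter.adb: the package body plays the role of the .cpp file
    package body Counter is
       Count : Natural := 0;

       procedure Increment is
       begin
          Count := Count + 1;
       end Increment;

       function Value return Natural is
       begin
          return Count;
       end Value;
    end Counter;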

Personally, I find both languages to be a complete clusterfuck. The difference is that C++ will compile anything and crash for no apparent reason whereas Ada will refuse to compile anything.


> But if you are planning a multi-decade project, wouldn't you just... train some yourself?

The reality is that most businesses are rather short-sighted and prefer to save on up-front costs, never mind how badly it can come back to haunt them in the future...


Ada's strong static typing may also produce less buggy code.


I don't think it helps produce fewer bugs, but it makes writing some kinds of bugs much harder.


Here's an apples-to-apples study proving otherwise:

http://archive.adaic.com/intro/ada-vs-c/cada_art.pdf

The other studies from that time period by the military and defense contractors showed something similar. In only one case did I see C++ and Ada tie, and why wasn't evident. Note this was before automated proving in SPARK, Design-by-Contract in Ada 2012, and contract/type/property-based testing tech. The numbers would likely be even higher now. Of course, one would want a comparison against tech designed for the same things in C, which is plentiful now.
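
For reference, Design-by-Contract in Ada 2012 is expressed with Pre and Post aspects, which can be checked at run time or discharged statically with SPARK. A minimal sketch with an invented subprogram:

    -- Invented example: the contract is declared on the spec.
    -- Pre is checked at the call site and Post on return (when
    -- assertions are enabled); SPARK can prove both statically.
    function Checked_Divide (A, B : Integer) return Integer
      with Pre  => B /= 0,
           Post => Checked_Divide'Result = A / B;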


I was talking specifically about the strong typing, but this is an awesome study comparing two languages with strong typing. Thanks for sharing.

There probably is a point where the strictness of the typing starts influencing the bugginess of the code.


C isn't strongly typed.


The parts that allow you to do unsafe casting are easily detected by static analysis.


You'd be surprised how much code you can move from conditional instructions to data types if you have a good type system, and if this type system is a static one, you get it checked before your program even starts.
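
A minimal Ada sketch of that idea, with invented names: the range constraint lives in the type rather than in scattered if-checks, so a caller can't even pass an out-of-range value without raising Constraint_Error, and the implementation needs no defensive code.

    package Audio is
       -- The constraint is part of the type; assigning or passing
       -- anything outside 0 .. 100 raises Constraint_Error at the
       -- point of the violation, before any logic runs.
       subtype Percentage is Integer range 0 .. 100;

       -- The body never needs to re-validate Level.
       procedure Set_Volume (Level : Percentage);
    end Audio;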


F# has this too. Other languages as well, of course. Heck, even Pascal had a limited form of it - enumerated types and user-defined array index types etc. - years ago.
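
Ada inherits the same Pascal-family features. A small sketch with invented names, using an enumeration both as a type and as an array index:

    package Schedule is
       type Day is (Mon, Tue, Wed, Thu, Fri, Sat, Sun);

       -- The array is indexed by the enumeration itself, so an
       -- out-of-range index is impossible by construction.
       type Hours_Worked is array (Day) of Natural;

       Week : constant Hours_Worked := (Mon .. Fri => 8, Sat | Sun => 0);
    end Schedule;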


These two statements seem to contradict each other.


You'll write different bugs in different languages. You can't leak memory in Java (well, you can, but it's very hard), for instance, as you can in C. In Python, I often write code that barfs when it gets an object of the wrong type, because parameter types are simply not enforced. The bugs I write in Lisp are entirely different from the ones I write in C.

We'll always write bugs.


But if we agree that Ada reduces one kind of bug compared to, say, C, then we'd also have to come up with a class of bugs that Ada makes easier to write in order to say that the total number of bugs stays constant while one kind decreases.

I can't come up with such an example.

You're right in that things are rarely black and white, but when it comes specifically to bug prevention, I think Ada is a strict improvement over many alternatives.


That's not the point, though. The point is that Ada lets us write without much extra effort:

1. Way fewer bugs in general.

2. Fewer severe bugs that become full-on crashes or hacks.

3. Fewer bugs we can't fail safe on and/or recover from.

Not all bugs are equal by far. Just see Rust's panics vs C's donate-PC-cycles-to-hackers technique.


> C's donate-PC-cycles-to-hackers technique.

I'll totally use that.


The same could be said about Haskell and I see it a lot in the startup space.

Sometimes, being first to market is less important than not scrambling your users' data.


An incomplete product is one thing, but failing type checks implies a non-working product, not an incomplete one. Businesses need correct code. The real question is: what is the fastest way to get correct code, a language that isn't as strict plus testing the heck out of your implementation, or a language that is stricter but harder to learn and write?

It seems to me in both cases you'll still need to test your code thoroughly, so not an easy question to answer.


All people are in space right now.


How many were reading this article while at work?


Awesome work you did there!


People who like to read - they read. No reason to make "systems" for reading - you do it if you want to.


What about people who want to read but don't like to read?


People who like to make stuff, they make stuff. No reason to build "businesses" around their stuff.

People who like to exercise, they exercise. No reason to create "routines" for exercise.

People who like to cook, they cook. No reason to develop "recipe" and "meal plan" to cook.

