My talk at App Builders 2016 was on Apple’s best programming language. Spoiler alert: it’s Dylan. Or is it?
I chose a few properties one might wish to find in programming languages, then demonstrated how these were all present in the Dylan language. I also took a dig at certain other languages, which do things in ways that could be seen as less good than Dylan’s. For example, did you know that there’s a programming language out there which distinguishes constant from variable values with the words let and var, rather than using the arguably more readable constant keyword?
Now here’s a thing: of course there were many languages iOS app programmers could have been using, even without Swift, if we didn’t want to use Objective-C. Indeed, a few months before Swift was introduced I enumerated some of these alternatives on this very blog. However, many of us chose to use Objective-C rather than any of the alternatives, and then chose Swift when that alternative was presented.
Similarly, Apple could have pursued any of those alternatives, and indeed did pursue quite a few of them. What would the world look like if Apple had invested in MacRuby? It would have Swift in it; we know that because they did, and it does.
At the end of my talk, I invited the conference to discuss what it was particularly about Swift that led to its brisk success, when it can be considered equivalent to many existing alternatives in numerous ways. Here are some of the suggestions (none of them from me, all from the audience):
- marketing
- evangelism
- LLVM
- Apple now isn’t the same as Apple in Dylan’s time
- “Halo effect” from iOS
- people only want to use first-party tools
- Objective-C pain provided the opportunity
- big enough community to reach critical mass
That leads me to wonder how closely related the conditions for “better” and the conditions for “accepted” are, whether there are “better” things out there for programmers that haven’t been adopted, whether those things truly are better, and how aware we all are of the distinction between being better and being popular when we make engineering choices.
The more I use Swift, and the more I see it evolve (especially the future evolution being discussed on the mailing list), the more I’m converging on a conclusion:
Swift is not perfect (no language ever will be, except Haskell, which keeps redefining what perfect is anyway). It may not be the best at any given specific thing, but it’s mostly pretty good at everything it needs to be (more so than all but a couple of other systems languages), and it is, IMO, the best overall fit in its role as a “modern” language that can be used with the iOS and Mac frameworks and is well suited to UI-centric code.
This is more than a “good enough” thing.
We can all easily point to a particular language feature and say, “language X is better at this because…”. But for almost all of those, you can come up with a counter-argument for why the Swift approach is better for its intended usage. The number of exceptions is diminishing.
For example, I was disappointed at first by the verbosity of sum types (enums) and pattern matching (switch). It was only when I started trying to teach them to people not already familiar with the concepts that I realised how seamlessly they scale from the old-school, degenerate forms of enums and switches familiar from C-based languages, right up to fully fledged discriminated unions and powerful pattern matching.
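To make that scaling concrete, here is a minimal Swift sketch (the types and cases are illustrative, not taken from the talk): a C-style enum at one end, and a discriminated union with associated values and pattern matching at the other.

```swift
// Degenerate form: looks and works much like a C enum.
enum CompassPoint {
    case north, south, east, west
}

// Fully fledged form: cases carry associated values, i.e. a sum type.
enum NetworkResult {
    case success(data: [UInt8])
    case failure(code: Int, message: String)
}

func describe(_ result: NetworkResult) -> String {
    // Pattern matching binds the associated values per case.
    switch result {
    case .success(let data):
        return "Received \(data.count) bytes"
    case .failure(let code, let message) where code >= 500:
        return "Server error \(code): \(message)"
    case .failure(let code, _):
        return "Failed with code \(code)"
    }
}
```

The same `enum`/`switch` vocabulary covers both ends, which is exactly the gradual slope that makes the feature teachable.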
Your opening point about “let” for constants is interesting too. My initial reaction was that “let” is perfect (it implies a declarative name binding rather than a memory location), but that “var” was the problem, because it makes declaring something mutable just as easy as declaring it constant, and I felt the default should be immutability.
But Swift is not an academic language. In the real-world, systems-level, UI- and networking-heavy world of Mac and iOS app development, with their OO frameworks, putting mutable types on an equal footing with immutable bindings is probably the right call (although I’m still unsure on this one; the point is that a reasonable argument can be made).
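To illustrate that binding-versus-mutability distinction, here is a small sketch; `Label` is a hypothetical stand-in for the mutable, reference-typed objects those frameworks are built from, not a real framework class.

```swift
// Hypothetical reference type, standing in for a typical framework object.
final class Label {
    var text: String
    init(text: String) { self.text = text }
}

let greeting = "Hello"   // immutable binding: reassignment is a compile error
var counter = 0          // mutable binding: reassignment is fine
counter += 1

let title = Label(text: "App Builders")
title.text = "App Builders 2016"   // allowed: `let` fixes the binding, not the object
// title = Label(text: "Other")    // error: cannot assign to a `let` binding
```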
So that’s where I think the lines should be drawn. Swift exists in a context, has been designed for that context, and, I think, fits that context in a way that external languages can probably never match.