Sunday, October 21, 2012

No One at Apple Understands Software (Part II)

In response to No One at Apple Understands Software, Mark K wrote:
So explain it in detail here. I'm fascinated.
For several reasons, I'd thought not to explain here why I believe that no one at Apple understands software. First, it's one of those things you'd rather explain in person so that you can see whether people are following you. Second, it's one of those things that might interest only two or three people in the world, one of them being me and one of them being Jonathan. Third, by the time I finish writing, Mark K's fascination will surely have passed.

Then I thought of the words of Iris' little friend Quinn, who recently took to repeatedly saying, "What the heck!" After all, if there's anyone who might enjoy a post posthumously, it's Jonathan, and how often does Mark K get fascinated by anything?

So here goes.

A Little Background
The following background is about me. I wrote it for anyone who likes to determine someone's credibility based on what she's done or what others say about her, rather than by simply evaluating what she has to say. You can skip it if you like, or to make it short and sweet, I'll quote a Salon.com article, The Linux jihad:
As soon as I met Tuomenoksa I realized that this was not your average CEO. A blond, earringed, intense-looking fast talker, it was clear after a few seconds that Mark was a geek, a true hacker. His cred was unimpeachable.
My background in software is unconventional, but then, I'm of a generation of computer scientists in which pretty much everyone's background was unconventional. When I started out, the idea of a university degree in computer science was dubious. People who worked in computer science had degrees in math or engineering or physics or linguistics; it was only the newbies who had actual computer science degrees. While my formal education at Berklee College of Music still placed me in the four-sigma range of backgrounds for computer scientists, it wasn't completely unheard of.

Of course, I never actually intended to become a computer scientist. I just wanted a day-gig with healthcare benefits. I got a clerical job at Bell Labs where I was the document librarian for a project called 3B5. The pay was good and the company had a great healthcare plan. With a wife and child at home and another child on the way, I felt more than lucky to have the job.

The problem was that I knew next to nothing about computers. Taking notes and recording action-items in meetings, I'd struggle to keep up with all the acronyms, jotting down each one and later seeking out people to decipher them. Since I was, shall we say "desperate" to keep the gig, I decided that I'd better learn something about computers.

After work each day, I'd sit before my TI Silent-700 using a UNIX program called "learn" to learn ed (a line editor), the Bourne shell, a scripting language called AWK and a programming language called C. To put what I learned into practice, I wrote scripts to automate my clerical job. Within a couple of months I was moved into a testing gig on a skunkworks project called 3B2. Skunkworks being what they are, the testing gig transitioned to driver development; the driver-development gig transitioned to kernel development.

While working on the UNIX kernel, I went to night school at Elmhurst College where I earned a bachelor of science in Computer Science. After we shipped the 3B2, the company put me into a program called OYOC (one year on campus). My wife Rene, our three kids and I all moved to Champaign-Urbana for eight months while I earned a master's degree in Computer Science. My degree in hand, we headed to New Jersey.

A few years later, I led a skunkworks to develop a binary-translation system that could (among other things) translate Macintosh applications to run on SPARC (I had SPARCstations with serial numbers 00000004-00000007) or MIPS. It turns out that applications generated by our translation system not only exactly preserved the semantics of the originals, but also ran about five times faster. The system worked so well that the folks at Apple decided to use it to migrate stranded Mac applications from the Motorola 68000 to the PowerPC.

Since what we were doing was commonly considered impossible, a group from research was asked to audit our project. Among that team were Al Aho, Peter Weinberger and Brian Kernighan (the A, W and K of AWK, Brian also being co-author of the definitive book on the C programming language). Not long after the audit, I was invited to join basic research.

My first boss in research, Tom London, and his partner John Reiser developed what we called VAX-UNIX (the UNIX operating system that ran on Digital Equipment's VAX-11/780 minicomputer). Tom's the guy who flew to Berkeley (the other one) with an OS tape in hand and gave it to Bill Joy, who went on to create Berkeley UNIX and to found Sun Microsystems. Working with Tom was fun. Every day we came up with ideas that would change the world.

Over the years I've developed operating systems, compilers, and embedded systems. I've written applications for the PC, the Mac, iOS, UNIX and even implantable medical devices. I was CTO and VP of Marketing at a small-cap public company in Boston that we sold to Intel. I founded an Internet security company for which I raised fifty-three million dollars in venture capital. I've been granted eleven software patents.

Mainly though, I like to code. Strike that. I love to code.

No One Understands
Enough about me; let's talk about what I think. In particular, I want to make sure that we're clear on what I mean by "No One at Apple Understands Software", or perhaps, on what I don't mean. I don't mean that no one at Apple knows about software. I'm sure that most of the software developers there know much more about software than I do. It's just that, from everything I've seen of late, no one actually "gets" it.

I'd liken the phenomenon to an expert guitar player who can't tune her instrument without a tuner, or a pianist who can't play without sheet music. Believe it or not, there are many of each. They can play fast. They can play cleanly. The guitarist knows her gear up and down. Yet she doesn't hear when a chord is slightly out of tune because one or more fingers are stretching a string just a bit too far. The pianist can play from pages black with notes, yet he doesn't hear that his time drifts.

Despite how well they play, despite all they know, they don't quite understand music because they don't quite hear music. You record them and they sound great. However, you can't use the recording, because when you put it in the mix with other instruments, the guitarist's pitch doesn't match that of the horn section and the pianist's hits are not in sync with the drummer's.

Similarly, there are programmers who can tell you about every nuance of a programming language. They can recite the command-line arguments to the most obscure UNIX or DOS commands. Yet, when you look at their code, something is off. It's out of tune, out of sync. The naming of elements and the levels of detail are inconsistent. Their code is so verbose that it's nearly impossible to determine or maintain context at a glance. They miss opportunities for abstraction, repeating long sections of code that perform the same functions but on the surface appear different.
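As a hedged illustration (both fragments are my own, invented for this post), the two passages below look different on the surface yet play exactly the same phrase, and the little function after them is the abstraction they both miss:

    /* Passage one: sum an array with a for-loop. */
    int total_a = 0;
    for (int i = 0; i < len_a; i++)
        total_a = total_a + a[i];

    /* Passage two: the same phrase, disguised as a while-loop. */
    int total_b = 0;
    int j = 0;
    while (j < len_b) {
        total_b = total_b + b[j];
        j = j + 1;
    }

    /* The missed abstraction: one function, written once, named well. */
    int sum(const int *values, int length)
    {
        int total = 0;
        for (int i = 0; i < length; i++)
            total = total + values[i];
        return total;
    }

With sum in hand, each passage collapses to a single line that reads at a glance.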

They know all about software and programs, but they don't quite get it.

Which Harry?
In many ways, writing a program is like writing a story; each of the characters must have a name that distinguishes her from the other characters. In a program the characters are items like variables and constants, classes and structures, functions and procedures. When you create any one of these, you must name it so that you can refer to it in your code. The name can be pretty much anything you like; it just needs to be unique within the context in which you're working.

For example, if you want to keep track of how many times you've done something, you can create a variable called count or counter. Each time you do the thing, you might have a line of code that says: count = count + 1. Easy, right?

What happens if you need to simultaneously keep track of more than one count? You'll need more than one variable. However, they can't have the same name. So, you might have count_1 and count_2, or countA and countB. When you complete an iteration of taskA, you write the code: countA = countA + 1; when you complete an iteration of taskB, you write: countB = countB + 1.
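Here's a minimal C sketch of the two-counter version (taskA and taskB are hypothetical stand-ins for whatever the real work happens to be):

    #include <stdio.h>

    int main(void)
    {
        int countA = 0;    /* completed iterations of task A */
        int countB = 0;    /* completed iterations of task B */

        for (int pass = 0; pass < 10; pass++) {
            /* ...do task A... */
            countA = countA + 1;

            /* ...do task B... */
            countB = countB + 1;
        }

        printf("A ran %d times, B ran %d times.\n", countA, countB);
        return 0;
    }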

What happens when your programming partner also has tasks A and B and uses the same naming convention? Well, it's kind of like what happens when you have a band with three guys named Harry. You have to find a way to distinguish Harry from Harry from Harry. You might have HarryTheDrummer and HarryTheGuitarist and HarryTheSinger. If two of the Harrys play drums, then you might have HarryTheDrummerWithRedHair and HarryTheDrummerWithBlackHair.

Anything that belonged to a Harry (e.g., a counter) would include that Harry's name to distinguish it, e.g., WalletOfHarryTheDrummerWithRedHair or CounterOfHarryTheDrummerWithBlackHair.

When rehearsal's over and the Harrys go home, each returns to being just Harry. There's only one Harry in the house. At Harry's house, to distinguish Harry's wallet from Frank's, you can refer to it just as WalletOfHarry, not WalletOfHarryTheDrummerWithRedHair, and of course, when Harry's by himself or talking about his wallet, he can use just MyWallet.

What happens at a battle of the bands where you never know if another band may have a HarryTheDrummerWithRedHair? How do you distinguish them? One way is to use the band's name along with the person's name. HarryTheDrummerWithRedHair becomes HarryTheDrummerWithRedHairOfTheThreeHarrysBand versus HarryTheDrummerWithRedHairOfTheTwoHarrysBand. The broader the context, the longer the name. The narrower the context, the shorter the name. In the battle of the bands, naming things could get pretty cumbersome.
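To carry the analogy into code (a loose C sketch using the story's own names):

    /* The battle of the bands: one flat namespace, every name fully qualified. */
    int WalletOfHarryTheDrummerWithRedHairOfTheThreeHarrysBand;
    int WalletOfHarryTheDrummerWithRedHairOfTheTwoHarrysBand;

    /* Harry's house: a narrower context, so shorter names suffice. */
    struct HarrysHouse {
        int WalletOfHarry;
        int WalletOfFrank;
    };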

What's this got to do with Apple? Keep reading.

i
In the early days of programming languages, there were no contexts for names. Every name in a program could be referenced in every part of a program. There were no homes or band rehearsals; everything was the battle of the bands. So you had to use names like HarryTheDrummerWithRedHairOfTheThreeHarrysBand and CounterOfHarryTheDrummerWithRedHairOfTheThreeHarrysBand.

The problem was that computers didn't have a whole lot of memory, and a name like CounterOfHarryTheDrummerWithRedHairOfTheThreeHarrysBand was untenable (not to mention prone to errors). To preserve memory, names had to be short. So, for example, CounterOfHarryTheDrummerWithRedHairOfTheThreeHarrysBand might simply become i and CounterOfHarryTheDrummerWithBlackHairOfTheThreeHarrysBand might become j.

This worked pretty well as long as you knew what i meant and what j meant. If a program was small, remembering i and j wasn't too difficult. However, as programs grew, it could become problematic.

Another problem was that no matter what you called it, each program element took up memory. To save memory you would reuse elements. At one point in the program, i might be used to count snare-drum hits.  In another it might be used to count bottles of beer. This worked as long as you never tried to count snare-drum-hits per bottle-of-beer. If you did, then you needed to add another counter, say k.
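A quick sketch of the reuse trick in C (the helper functions are hypothetical, marking where real work would go):

    int i;

    i = 0;                            /* for now, i counts snare-drum hits */
    while (song_is_playing())
        i = i + count_snare_hits();

    i = 0;                            /* later, the same i counts bottles  */
    while (band_is_drinking())
        i = i + count_bottles_emptied();

    /* To track snare-drum-hits per bottle-of-beer, the two counts must
       live at the same time, so another counter, say k, is unavoidable. */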

So, not only did the name i tell you nothing about what i was or did, but what it was or did could change from moment to moment.

You Can Call Me Al
As programming languages developed and as computers became bigger and faster, the problem of names that were too long or too short to be manageable went away. The primary improvement, begun with structured programming and continued in object-oriented programming, was the introduction of structures (and later classes and objects) that provide contexts for the use of names.

With languages such as C and Pascal, programmers could define reusable structures suited to the needs of their programs. Each language provided basic types (e.g., integer and decimal numbers, text strings, lists or arrays) from which other structures and types could be built. Each data structure provided a context in which names could be created without concern that they might have been used elsewhere.

For example, every single structure could have a counter named counter or a counter named i. For that matter, one structure could use i as a counter and another could use i as a text string. It didn't matter, because each structure was a sovereign domain for names.

When you needed to combine values from two different structures to compute something (e.g., the ratio of drum hits to beer bottles), you could reference each value by naming the instance of its structure (e.g., hits.count / bottles.count). The new languages made it possible to use names that were meaningful and yet manageable and easy to work with.
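A minimal C sketch of the idea, carrying over the drum-and-beer names from the analogy (the enclosing function, rehearsal, is just scaffolding):

    struct tally { int count; };           /* each instance owns its own 'count' */

    void rehearsal(void)
    {
        struct tally hits    = { 0 };      /* snare-drum hits */
        struct tally bottles = { 0 };      /* bottles of beer */

        hits.count = hits.count + 1;       /* no collision: each 'count' lives */
        bottles.count = bottles.count + 1; /* inside its own structure         */

        double ratio = (double)hits.count / bottles.count;  /* hits per bottle */
        (void)ratio;
    }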

So what does all this have to do with no one at Apple understanding software? Although Apple's programming environment is based on Objective-C (a superset of the C language that came out of Bell Labs) and although Apple's OS X is built on UNIX, the development environment and the coding examples Apple provides still use names like BandsInTheBerkshires.SKABandsInGreatBarrington.ThisSKABand.ThisSkaBandDrummers.HarryTheDrummerWithRedHair.NumberOfDrumHits.

OK, that's slightly exaggerated, but not by much. Sure, the programs work. Sure, from reading a variable's name, you can tell what it's about. However, when good programmers see names that are overly long, filled with redundant information, or packed with information that could be garnered from the context, they respond the way a musician responds to an out-of-tune solo, or the way most people respond to nails scraped along a chalkboard.
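To make the contrast concrete, here's a hedged C sketch; the long name is my own invention in the spirit of the exaggeration above, not something lifted from an Apple header:

    /* Battle-of-the-bands naming: every fact baked into the name itself. */
    int numberOfDrumHitsOfHarryTheDrummerWithRedHairOfThisSkaBand;

    /* Contextual naming: let the structure carry the context. */
    struct drummer { int hits; };
    struct drummer harry;               /* harry.hits says the same thing */
                                        /* with a fraction of the ink     */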

Context, Context, Context
There are many problems with overly long names. First, they're difficult to remember and therefore to use, especially when you must learn a different name for the same element depending upon the structure in which it is used.

Second, their use is error-prone, especially when you have two twenty-character names that differ in only the nineteenth character. Sure, to help reduce errors and increase speed, the programming environment offers lists of names that match what you've typed so far. However, when good programmers work with well-written programs, this kind of function is never needed.

Third, overly long names make it nearly impossible to intuit how to use a structure. With well-written programs, you get an immediate sense of how the programmer used names. Once you've got it, you can pretty much do everything you need to do without looking anything up. There are instances of this in the Apple environment, but they're by no means common, let alone pervasive.

Fourth (and perhaps most importantly), shorter names allow you to fit more context on your screen, and the more context you can see simultaneously, the shorter the names can be. If, for example, you write a section of code that runs through a list of items counting the ones that are checked, and the entire section easily fits on the screen, then the name i works. You don't need to tell anyone that i is a counter, because anyone looking at the code can see the entire lifespan of i.
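Here's the sort of thing I mean (a sketch, assuming items is an array of records with a checked flag and n is its length):

    int checked = 0;
    for (int i = 0; i < n; i++)       /* i is born here...                   */
        if (items[i].checked)
            checked = checked + 1;    /* ...and visibly dies two lines later */

The whole lifespan of i is four lines; no longer name could make it any clearer.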

Seeing the semantics of a context is to programming what hearing chord structures is to music. A dominant-7 chord has a distinctive sound regardless of the key in which it's played. A sharp-9 chord is easily distinguished from an augmented-fifth chord. You don't have to break the chord down into its individual elements, arpeggiating them one by one, to know which chord is being played. You wouldn't want to, because it would just slow you down. You hear the notes all at once, and from the color of the context, you know the chord and you know the role of each of the notes. There are musicians who hear music this way and there are people who don't.

There are programmers who can look at a section of code and, without working line by line through each of the details, see what the code does, just as a musician hears a chord and immediately knows its structure. For them, the most important things are brevity and consistency. These programmers are often ten times more productive than people who don't look at and see code in the same way. It's easy to tell which type of programmer an organization employs; you can see it in the software they produce.

I Could Go On
There are lots of other little indicators that no one at Apple understands software. I'll give you a couple of them and then I'll stop.

There's the demise of the 17" MacBook Pro. It's something lamented by every good programmer I know. There are two reasons for this. Real programmers are all about context; big screens mean more context. Sure, you could move to a tower/desktop system and use an external monitor, but then you'd have to use a mouse or track ball. That leads us to the second reason.

In my experience, there's nothing faster than a keyboard with an integrated trackpad. You can type with both hands, move the cursor, scroll and click, all without changing hand positions. Moving one hand or the other away from the keyboard and back slows you down, especially when it's happening every couple of seconds. This may not seem like a big thing for your average computer user, but when you're working as fast as you can to keep up with your mind's production of code, little things like this can be a real problem.

With the demise of the 17" MacBook, it would seem that no at Apple gets this, though of course it could be that they do but just don't care. From a marketing and profitability perspective, it might make perfect sense to discontinue the model. However, from a programmer's perspective, it's a disaster.

The last thing I'll mention is simply how buggy the Mac's code has become of late. I've seen lots of companies go through a process we used to call "losing control of the base". It occurs when the software core becomes overpopulated with bad code. Oftentimes each of the individual modules works, but when you bring them together, the interactions among them cause problems that are nearly impossible to trace. Microsoft lost control of their base with Vista and it took them a long time to rein it back in. Apple seems to have lost control of their base, at least large portions of it.

That's All for Now
If you've read this far and are not Jonathan, I guess I've got one question: why? I don't mean it facetiously. It's just that I'd be amazed that anyone would be interested enough to read all this, or that they wouldn't see it as lunatic ramblings.

For me, it's been fun to write about why I think no one at Apple understands software. I love coding as much as I love playing music. If I could figure out a way to do it for an audience, I might like it even better than music. It might seem a bit bizarre or silly, but this kind of stuff matters to me. Thanks for listening.

Happy Sunday,
Teflon

6 comments:

  1. Thanks, Luke.
    It just occurred to me that were I really sharing with Jonathan my belief about no one at Apple understanding software, it probably wouldn't have taken so many words. It would have gone something like:

    I don't think anyone at Apple really gets software.

    What do you mean?

    Twenty character variable names that differ in the nineteenth character.

    Really?

    Yeah, pretty much. And get this. For-loops that don't fit on a screen.

    No way.

    Yup.

    Replies
    1. Just when I think I've buried the programmer in me, a thought like this creates a little buzz.

      Faith

  2. Never judge a fascination by its cover. I did read the whole thing. While I'm pretty sure I'd be a study in verbosity as a programmer, it was enlightening to walk through your mind and see the details behind your opinion. Thanks.

  3. Well, Tef, I hear there was a shakeup at Apple recently, with a top-level head rolling in the software area, so maybe somebody there is waking up to this ???

    I have a nephew working in Silicon Valley, and I've put him on the job of locating a receptive ear within Apple. We'll see what comes of it ...

  4. Hey Sree, Thanks! That would be great.

