Chapter 4

Programming

Ada Lovelace

Subsections of Programming

Introduction

YouTube Video

Resources

Video Script

In this video, we’re going to take a look at computer programming languages and some of the history behind them. First off, what exactly is a language? It’s something that we don’t really think about very often. A language is really just a set of symbols, whether gestures, words, sign language, or even Braille, that are used in a uniform fashion to allow people to communicate with one another. But we also have to consider how we as humans can communicate with a physical, inanimate object: a computer. In order for us to tell computers what to do, we had to develop languages that we could use to communicate with them. Here on my slide I have, on the left, a number of different ways to say good morning in the various human languages we use to communicate with one another. On the right side, we have a number of computer languages: Python 2, Python 3, C, Java, JavaScript, and even Rust. This is just a small sample of the languages we could use to communicate this particular message to our computer, just as we have a lot of different ways to say good morning to someone.

Let’s take a look, though, at where the idea of a computer language actually originated. Ada Lovelace is the first influential woman in computer science that we’ll talk about. She is regarded as one of the first figures in history to truly understand computer programming, possibly second only to Charles Babbage. Ada Lovelace was the daughter of the famous English poet Lord Byron, who I’m sure you’ve heard of before, but she had very little contact with her father, and during her childhood she was largely on her own. Due to her father’s status, though, she was tutored by some of the greatest mathematicians of that time, including Augustus De Morgan, whom you may remember from the Boolean logic lecture as the namesake of De Morgan’s theorem. So she was being tutored by some of the great mathematicians who were laying the groundwork and foundations for what we use in computer science. One of the greatest quotes about Ada Lovelace comes from Charles Babbage, who described her as that “Enchantress who has thrown her magical spell around the most abstract of sciences and has grasped it with a force few masculine intellects could ever have exerted over it.” This is a really big compliment from Charles Babbage, who was pretty much the pioneer of computer science at the time.

Ada Lovelace spent a lot of time visiting Charles Babbage, and she was intrigued by the prototype Difference Engine that he had in his house. Through those visits, she began translating a memoir describing his work into English so she could bring it back to England and explain how it worked and why it was so important and revolutionary. To assist with that, she included a set of notes with her own descriptions of the design of his Difference and Analytical Engines and how they actually worked; when completed, her notes were actually longer than the original memoir itself. So she had a very deep understanding of how Charles Babbage’s machines actually worked.

What makes Ada Lovelace so unique in the history of computing is Note G in her notes. In that section, she describes in complete detail how you would use Babbage’s Analytical Engine to calculate a sequence of numbers called the Bernoulli numbers, which you may be familiar with. While it appears to be written in English, it’s actually designed in a way that could be directly used by the Analytical Engine. So, in a way, you could say that it is in a language the Analytical Engine would understand. In doing so, she wrote the first computer program, which is simply a series of steps specifically designed in such a way that it could be easily run on a computer.

There is some debate as to whether or not she wrote some of the programs previously attributed to her, but there’s little doubt that she was capable of doing so. In fact, she was one of the few people who saw the true potential of what Babbage had actually created. She once remarked that under the correct circumstances, the Analytical Engine could be used to create music. That’s really quite extraordinary, and something that Babbage probably hadn’t thought of at the time. But if you are a music lover at all, you know that music is just simple mathematics. And so if you have a machine that can compute mathematical values, why couldn’t it be used to create music? All told, Ada Lovelace is truly a very important person in the history of computer science, and she’s widely regarded as the world’s first true computer programmer. As another bonus side note, she’s also credited with discovering the first bug in a computer program: a bug that, in fact, was found in a program written by Charles Babbage.

Grace Hopper

YouTube Video

Resources

Video Script

Our next famous woman in computer science is Rear Admiral Grace Hopper. Grace Hopper enlisted in the Navy and worked on the Harvard Mark I computer, as well as the UNIVAC, a successor to the ENIAC; we’ll talk about the UNIVAC and ENIAC computers in another lecture. Grace also developed a programming language called FLOW-MATIC, which was later adapted into a language called COBOL, which is still somewhat used today. COBOL is traditionally a language that was used very heavily in the financial industry, because it was used to generate a lot of financial reports. You don’t see it a whole lot now; you may see it used occasionally at places like banks or accounting firms, but it’s not very widely used anymore.

But beyond that, she was truly a pioneer and an out-of-the-box thinker for her entire life. As you’ll see in the following videos, Grace Hopper was a pretty sassy lady, but a pioneer in modern computing. She was also pretty famous for her description of a nanosecond on the David Letterman show, which you’ll see here in just a little bit. One other thing: every year there is a Grace Hopper conference to celebrate diversity, inclusion, and women in computing, and there are actually scholarships available for students who would like to attend. It’s something I would highly recommend if you ever get the chance; it is truly a great experience to participate in.

Nanosecond

Discussion of a nanosecond starts at 4:17

YouTube Video

Margaret Hamilton

YouTube Video

Resources

Video Script

The next influential woman in computer science that we’ll talk about is Margaret Hamilton. Margaret was the director of software engineering at MIT’s Instrumentation Laboratory, and she is essentially known as the creator of the field of software engineering. The reason she is credited with founding software engineering is that she was responsible for the team that developed the onboard flight software for the Apollo space program. During that timeframe, her software actually prevented a mission abort during the Apollo 11 moon landing, which is kind of crazy cool to think about, right?

In this picture, she’s actually standing next to the printout of one of the programs her team had made. In the video here, you’ll see that programming at this time was completely by the seat of your pants; there was really no step-by-step procedure or best practice for programming much of anything. So the idea that NASA trusted this group of engineers to create the flight software for the Apollo space program is really kind of crazy to think about. And if you messed up, the printout of your program would just be, you know, feet high instead of just a few pages long, which is what you see in this famous picture of Margaret Hamilton on the slide.

What is Programming?

YouTube Video

Resources

Video Script

So what really is programming? We’ve talked about several famous women in computer science so far who have been very influential: Ada Lovelace, the first computer programmer; Grace Hopper, the queen of software; and Margaret Hamilton, who really pioneered the idea of software engineering and best practices in computer programming. But what does programming actually entail? Well, computer programming is simply the act of designing, writing, debugging, and maintaining the source code of a computer program.

Let’s imagine that we’re writing a very simple program that takes input from the user, divides it by 61, and then prints out the result. So the user inputs a number, and then we divide it by 61 and output it. What would that program actually look like? Well, in Scratch, it would look something like this. This programming language is very English-like: it’s a block-based programming language designed to teach kids, even up through middle school, the basics of how to program, and it makes programs a lot easier for humans to understand. Even if I showed this to my grandma, she could probably deduce what’s going on: ask for a number and wait, then divide whatever the answer to that question was by 61, and stop. Simple enough, and very easy for a human to understand.
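Since the slide itself isn’t reproduced in these notes, here is a rough text rendering of what those Scratch blocks might look like (the exact prompt wording is an assumption):

```
when green flag clicked
ask [Please enter a number:] and wait
say ((answer) / (61))
```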

But do you really think that a computer is able to understand this language? Probably not, right? My computer has no idea what a block is. Computers really only care about ones and zeros. That’s it. So how do we get from a block down to something a computer can actually understand? That question highlights our language hierarchy. At the top are the high-level languages, the stuff that humans can understand and that we pretty much deal with on a daily basis as computer scientists. From there we can go down the stack into assembly, then machine code, and then down to the hardware, where the physical circuitry can actually interpret what we’re trying to say. But first, let’s take a look at the high-level languages.

So we’ve already showcased Scratch, which is very human readable. But here is the same program written in C (or C++), and we’re now starting to get into something that is not so human readable. In general, this requires a little bit of training to understand, just like if you don’t know Spanish or French, you need a little training to figure out what’s going on when reading those languages. We have a lot of extra syntax here, like printf. What does that mean? What is a float? If this were my first time ever reading a programming language, I’d have no idea what some of these things are. But again, this is the same program: input a number, divide it by 61, and output the result.
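The slide isn’t reproduced here, but a minimal C sketch of that same program might look something like this (the variable name and prompt text are assumptions):

```c
#include <stdio.h>

int main(void) {
    float x;                          /* the user's number */
    printf("Please enter a number: ");
    scanf("%f", &x);                  /* read a floating-point value */
    printf("%f\n", x / 61);           /* divide by 61 and print the result */
    return 0;
}
```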

The same thing goes with Java. As you can see here, we have a lot more fluff: a bunch of import statements up top, we now have to have a class, and we have to have a public static void main(String[] args), whatever the heck that’s supposed to mean, right? Even beginner Java programmers aren’t introduced to that concept until they get deeper into their first course. What’s a Scanner? What are we scanning? I don’t know. So again, it requires a little extra training, and Java, being an object-oriented programming language, adds a lot more fluff, syntax, and wrapping to our program, even for a simple program like this, just in order for it to work.
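As a hedged sketch of that Java version (the class name, variable names, and prompt text are assumptions):

```java
import java.util.Scanner;

public class Divide {
    public static void main(String[] args) {
        Scanner input = new Scanner(System.in);  // reads from the keyboard
        System.out.print("Please enter a number: ");
        double x = input.nextDouble();           // the user's number
        System.out.println(x / 61);              // divide by 61 and print
    }
}
```

Notice how much of this is just scaffolding: only the last two lines actually do the work of reading a number and dividing it by 61.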

And C# isn’t much better. C# is essentially Microsoft’s flavor of Java, and it does the same thing: inputs a number, divides it by 61, and outputs it. I would argue that this code is a little bit easier to understand, but again, we have a bunch of extra fluff that, compared to Scratch, is more difficult to understand without a little extra training.
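A minimal C# sketch of the same idea (again, names and prompt text are assumptions):

```csharp
using System;

class Divide
{
    static void Main()
    {
        Console.Write("Please enter a number: ");
        double x = double.Parse(Console.ReadLine());  // read the user's number
        Console.WriteLine(x / 61);                    // divide by 61 and print
    }
}
```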

Now, Python, like Scratch, is generally touted as very human readable. It doesn’t have all of that extra fluff: we have no class, namespace, or imports to define here. We just take input, convert it to an int, store it, and then print out the result. Simple enough. All of that extra fluff we had in C# is reduced down to two lines in Python. That’s why Python is often used as a scientific language: it’s a lot more accessible to non-computer scientists.
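Here is what that two-line Python version might look like (the prompt text is an assumption):

```python
x = int(input("Please enter a number: "))  # read the user's number as an int
print(x / 61)                              # divide by 61 and print the result
```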

But again, there are tons of different programming languages out there, and they all have their purpose and their place, some more human readable than others. We’ve talked about COBOL already with Grace Hopper; there’s also Pascal, Visual Basic, Perl, and Fortran, which is an old one as well. BASIC is another older programming language. There are way too many to list here. The point is that we have a lot of different languages with various purposes, but a trained computer programmer could use any of them to communicate a particular idea or task to the computer. But again, none of those languages can be understood by the computer directly.

The majority of those languages will have a compiler associated with them. Assembly language is typically still readable by humans, but it is much closer to the language a computer actually understands, and it requires much more training to follow what’s going on. To go from a high-level language to assembly language, we use a program called a compiler, whose entire purpose is to act as a translator: the language we used to write our program, like C for example, is compiled, translated, down into assembly language, which is closer to what the computer can understand. Here is our C program and the assembly language associated with it after it’s been compiled; this particular part is simply taking a number and dividing it by 61. And this is a heck of a lot more complex, isn’t it? If you take some of our upper-level computer science courses, you will learn how to read all of this and understand what’s going on. But the majority of computer scientists won’t need to understand assembly language unless they’re dealing with things at the operating system or hardware level, or even cybersecurity; if you’re into cybersecurity, being able to read some assembly is quite useful. Still, assembly is not directly interpretable by your hardware quite yet.
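To give a taste of what that compiled output can look like, here is a hypothetical x86-64 fragment performing the division itself; this is an integer-division sketch with assumed register choices, not the exact output of any particular compiler:

```asm
        mov   ecx, 61        ; load the divisor, 61, into ecx
        cdq                  ; sign-extend eax into edx:eax for the divide
        idiv  ecx            ; divide edx:eax by ecx; quotient ends up in eax
```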

Once we have our assembly language, it is then fed into the assembler, which converts it into machine code that can be read directly by your computer. Some of this will actually depend on the operating system and the architecture you’re running on your computer: while a high-level language is mostly uniform across all machines, the assembly code is not, and likewise, machine language is very specific to the machine it’s being built for. Machine language is simply a set of binary codes, all ones and zeros, that tell the computer exactly what to do. Here’s the machine code for the assembly language we just saw. Theoretically, you could translate all of these ones and zeros back into the original program, but that’s pretty difficult to do; it requires a significant amount of training to translate these ones and zeros back into what they originally were.
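For a concrete illustration, the hypothetical three-instruction fragment above encodes to just a handful of bytes under the standard x86 instruction format, shown here in hex and binary:

```
mov ecx, 61  ->  B9 3D 00 00 00  (10111001 00111101 00000000 00000000 00000000)
cdq          ->  99              (10011001)
idiv ecx     ->  F7 F9           (11110111 11111001)
```

Reading the raw bytes on the right back into the instructions on the left is exactly the kind of reverse translation described above, and it only gets harder as programs grow.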

And now we’re getting into code that can be interpreted directly by your machine’s hardware. All your hardware really cares about is current or no current: the electricity is on or the electricity is off. This simply refers to the structure of the chips and circuits that your computer is made of, and we’ll talk more about that in another lecture. But the gist of it is that we feed all of our ones and zeros into the inputs of our hardware, which takes those ones and zeros, performs some sort of calculation, and outputs the result back to your computer.

But overall, this is the big picture of computer programming. We go from a high-level language, where we have things like Python, C, C#, Java, and JavaScript, all of those languages you would normally program in as a software engineer. Those languages are taken by a compiler and compiled into assembly language, which is more machine specific, depending on the operating system (macOS, Windows, Linux, and so on), on whether you have a 64-bit or 32-bit processor, and on lots of other things that go into that particular compilation. That assembly language is then taken by the assembler and converted into machine language, which is just the ones and zeros that your hardware is going to interpret.
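If you’d like to watch this pipeline happen on your own machine, a compiler like gcc can stop after each stage. Assuming the C sketch from earlier is saved as divide.c, something like this should work on most Linux or macOS systems:

```bash
gcc -S divide.c -o divide.s   # compile: C source -> assembly language
gcc -c divide.s -o divide.o   # assemble: assembly -> machine code (object file)
gcc divide.o -o divide        # link: machine code -> a runnable program
./divide                      # run it: enter a number, see it divided by 61
```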

Pattern on the Stone Reading

Read Pattern on the Stone, Chapter 3.