Your First Language

JavaScript? Python? Or should I just stay out of it all?

February 13, 2020

I have been programming for about a decade now. I have (obviously) learned a fair bit through this time, but perhaps the question I am most frequently asked is this:

What language should I learn first?

This question almost never comes as a surprise to me. Usually, anyone who has expressed any interest in computer science to me will inquire about this sooner or later. Answering this question is actually quite difficult for me—I did not learn programming in an orthodox way.

Is it for me?

First, I think it is important to recognize that learning to program requires a certain natural mindset. This view conflicts with the missions of a lot of coding-education organizations, and with public schools encouraging everyone to learn to code at least once.

While I agree with this philosophy to some extent, being able to think logically and naturally architect code to accomplish a task is a skill some people simply possess. I’ve realized over time that some people just don’t have the mindset required to program.

To be clear, I don’t want to be mistaken for someone who thinks "being able to program is a higher-intellect skill"—rather, programming is a field selective to people actually interested in (and built for) it. Just as business requires a specific interest and mindset from the people talented in it, the same criteria apply to computer science. Not everyone is designed to work with computers.

On the other hand, basic programming knowledge can be fundamental if you work with computers on a long-term basis. There are certain automation techniques I see being useful if you’re working with computers from nine to five. A great example is the variety of Excel-like tools that exist. Formulas, as Microsoft has coined them, are a declarative form of "programming language"—really just a declarative chain of "function calls" that accomplishes a basic task. The thinking required to create the path from idea to execution is, in essence, extremely similar to the thinking required to program.

In short, I think the mindset you exhibit doing automation and logic work on a computer, apart from programming, is extremely similar to the mindset of programming: both are highly logical. It follows that this mindset is naturally present in some people, and that teaching code from ground zero isn’t always possible.

Okay, okay. But which one?

It depends on who you ask. Many public schools, including mine, teach JavaScript as a "first language." Many colleges, however, teach Python as a first language, or sometimes even C or C++.

My public school teaches its CS curriculum through an online platform, which teaches (what appears to be) a slightly modified and sandboxed version of browser JavaScript. While this "dialect" of JS can be a beneficial first language, I felt that the curriculum’s introductions to code failed to teach one of the primary aspects of programming: the logic required to write code.

Realistically, when people ask me The Question™, my go-to response is "I don’t know." Truthfully, it's more important to consider your ability to use logic than to question what your first programming language will be.

Most programming languages function in fundamentally similar ways: they accept the characters you type and translate recognized syntax into instructions the machine can execute. This is true of almost every language under consideration when the question is asked. But as I mentioned before, the absolute most important thing to realize when learning to code is that you do far more thinking than typing, and that thinking is completely independent of the language you are using.

Introductory code classes

Again, this is why I think the platform’s curriculum is lacking. The initial "coding" it has students do is commanding a turtle to do basic things (e.g. move forward, turn left, etc.). While one could argue that the progression of lessons builds up to teaching what is essentially "programming logic," I still felt my peers were stumped more by JavaScript syntax than by the logic required to solve a problem, for one specific reason: the curriculum will often give you the solution to a problem, be it through code or through some basic English "pseudocode." That is, a lesson might say

Accept a number of the user's choosing.
Then, if the number is greater than five, tell them their number is too big.
Otherwise, don’t do anything at all.

While this example is not the most applicable (it doesn’t really have a start and end goal), almost every prompt ends up doing this. Moreover, this solution isn’t even a last resort—the prompt actually wants you to take instruction from it.

Thus, this type of end-goal handholding just causes my peers to be more confused by JavaScript’s (poorly explained) syntax than to feel challenged by thinking through and producing instructions logically.

Thanks! I’m definitely not going to copy-paste this. I’ll just make "something like this."

The curriculum presents programming mechanics by simply stating the end action and showing how to accomplish it. There is no explanation of specific details, just that it does something. Functions are first introduced when the student calls moveForward(); on the turtle, with no explanation of why the function call looks the way it does. The curriculum eventually teaches function declaration, but still without explaining the syntax. If I were learning how to code again, I think a thorough explanation of the syntax would be extremely helpful. It is very difficult, however, to teach someone a logical mindset. I am convinced my logic comes purely from experience and natural interest.


If we look past my poor experience with taking an "online code class," we can move on to the expected response of The Question™. Up until this point, we’ve only spoken of JavaScript as an introductory language.

I analyzed ten different colleges and the languages they teach for introductory classes.


Interestingly, Python is the most common, accounting for 50% of the languages taught in introductory classes. The only class teaching JavaScript is Harvard's introductory class, which also teaches two other languages.

This data seems to inherently question the choice of teaching JavaScript to newcomers to computer science. Why is JavaScript taught as a first language?


JavaScript

JavaScript is an interpreted language designed for the web. By no means was it ever intended to be used for much beyond that, but some people have taken it too far. It is a loosely typed language with syntax that some people claim is C-like (anything goes, really) and a very interesting type system.

Personally, I believe JavaScript induces bad programming habits. Its dynamic type system is almost too dynamic. Its prototype system is ridiculously questionable (look to Lua for the better alternative here), and serves as a poor attempt at flexible object-oriented programming. Asynchronous code is an absolute nightmare in JavaScript; while Promises mitigate this to some extent, they still aren't a perfect solution. Every helpful addition or bit of syntactic sugar bolted onto JavaScript feels wrong.

On the bright side, JavaScript's syntax is representative of C-like languages. And it's quick to learn if you're already familiar with programming, but you'll probably end up hating it.

That's just about all I can give JavaScript positively. I have (proudly) written no JavaScript for this website to this point. The only JavaScript active on this page is a syntax highlighter that I did not write.

In conclusion, based on my experience, JavaScript is not a good starting language. Its type system and API induce bad habits. But on a more general note, what are you even going to make with JavaScript? The only reasonable response to that question is "a website" or "a web app." Even then, alternatives for building web apps exist, like Elm and Mint.

node.js exists as a complete standalone JS runtime to create (get this) standalone JavaScript applications... off the browser. Can you imagine that? What's more—some people even use a combination of node.js and Chromium to make "native web apps", like Discord. For bonus research, look into React and Angular. But don't let me give you any ideas.

While you're at it, actually, I encourage you to browse packages on node's package manager, npm. Look into the source for yourself.

Anyways, I digress.


Python

Python is the most popular language among college introductory classes. One of the most used languages overall, Python is an interpreted language with a much better type system than JavaScript.

Its syntax, which looks somewhat similar to YAML, relies largely on indentation. It is not representative of C-like languages, which sets it apart from the other four languages in the table: JS, Java, and C/C++. Initially, I thought Python's syntax was sort of unsightly, but with time I've grown to enjoy reading it. It might not be too similar to other languages, but its simplicity and usability make the language very accessible for beginners.

Python is presently third on the TIOBE Index. Its usage is steady, especially in data science.

There is not much negative I can say about Python, except for one general complaint about interpreted languages.

Interpreted & dynamic languages

An interpreted language is one whose source is typically processed into tokens (the smallest meaningful units of syntax) ahead of time, but is read and executed by an "interpreter" in real time. Thus, new code can be executed even after the program has started running.

Conversely, a compiled language is one whose source is translated by a "compiler" into native machine code at compile time, so no new code is introduced at run time. Compiled languages are generally much quicker than interpreted ones, as the computer has significantly less thinking to do when running code that is already translated into the language understood by the machine.

In terms of learning computer science, I'd personally recommend that a compiled language be taught first. Interpreted languages suit a fairly narrow set of projects.

JavaScript is interpreted so it can be lenient about errors and quick to type. It is run in real time by the browser, so it makes sense for JavaScript to be interpreted. It is also dynamically typed, meaning a variable's type does not have to be explicitly annotated; the interpreter can infer it. Likewise, Python is interpreted and dynamically typed.

Languages like C, C++, and Java are compiled and statically typed—the opposite of JavaScript and Python. A statically typed language is one where type declarations are usually necessary (with the exception of type inference in some languages): the computer has no idea what type a variable should be until you indicate it, and it will refuse to reassign an existing variable of one type (say, a string) to a value of another type (like a number).

Statically typed languages are arguably safer. Because type annotations are required, you save a lot of time otherwise spent ensuring a variable has a value, or has the correct type of value.

Let's compare similar snippets from two languages, one dynamically typed, and one statically typed.

// Let's make a few variables.
var x = 4;
var y = 3;

// Maybe a function to add them.
function add(a, b) {
	return a + b;
}

// Let's test it!
add(x, y); // -> 7

// But what if we change the variables' types?
var x = "4";
var y = 3;

add(x, y); // -> 43
// Oh no! The function now concatenates instead of adding. Also, what is the type of the return value? We can't naturally specify the type of anything.

// Let's make a few variables, declaring their type explicitly.
int x = 4;
int y = 3;

// Maybe a function to add them, again, declaring return type and parameter type.
int add(int a, int b) {
	return a + b;
}

// Let's test it!
add(x, y); // -> 7
// Nice, expected behavior.

// Let's change the variable types.
int x = "4"; // Whoops, we can't do this. "4" can't be cast (converted) to an integer.
// Okay, what about...
string x = "4"; // Nope. x is already defined in this context and already has a type.

The statically typed language's behavior is recognizable by a linter (a tool that checks code for errors before it is executed), while a dynamically typed language usually won't exhibit the odd behavior until it is actually run. Therefore, statically typed languages are naturally safer.

What a first language should be

In my personal opinion, a C-like language is the optimal "first language." An extremely low-level and primitive language like C is excellent for dipping your toes into the waters of programming. It is statically typed, extremely fast, and does almost nothing to hold your hand—you are responsible for carving your own path.

There are no helper methods; there isn't even a string type. There is no garbage collection. There is no luxury for newcomers.

C should be taught at a very basic level, covering the basics of types, memory management, and application development. Only then should the aforementioned luxuries be revealed to the student. It makes more sense to start with a more difficult and less lenient language: it forces the student to think like the computer, and to understand why a "luxurious" language like Python is so much simpler to type than C.

As mentioned, C has no helper methods. Languages like JavaScript and Python ship with extremely helpful methods for splitting strings, reversing arrays, and other simple logical tasks. The lack of these methods in C is an excellent way to force students to think logically; curriculums could then pose challenges requiring students to replicate these luxuries. Teaching a luxurious language first encourages students to become accustomed to these helper methods. It is important that students understand why some array partitioning method works, so that they will not take it for granted when moving to another language.

Moreover, C is 2nd on the TIOBE Index. It is still an extremely widely used language, and it has inspired a tremendous number of languages after it. Why else would so many people claim they want to learn Latin first, to understand the origins of words across so many languages? C might not be fun to type compared to other languages, but the pain and suffering it induces is truly beneficial long-term.

Or, you choose

If you feel you're logically capable and ready to code, teaching yourself isn't out of the question at all. Pick a language that looks enjoyable to you. Maybe think of an end goal, something you'd like to produce over a period of time. Read into the type of application it would be, and pick the most appropriate language.

Into web development? Ignore what I've said and teach yourself some JavaScript. Game development? Maybe C# is the way to go if you'll be using Unity. Artificial intelligence? Use Python; TensorFlow is a great platform.

All in all, which language you choose to learn first isn't the most important thing. Understanding how a computer is going to interpret your instructions is absolutely critical to writing any code. While I personally believe generic coding curriculums (disregarding college courses) are bland and saturated with pre-written code, maybe you'll find one that helps.

So please, if you're thinking about programming, understand that this question misses the point. There is a deeper understanding required to learn how to program; choosing a language is just a minor step in the process.