
How to learn programming

Mads Brodt • December 2021

3 min read

A few days ago I posted a tweet about which programming languages I think would be best suited for teaching programming at universities:

I'm really confused why Python and JavaScript aren't the main programming languages used to teach programming at university level. Yes, they're higher level than Java or C# - but they're also the easiest way to inspire students by allowing them to create cool stuff. Just me?

The tweet blew up, with lots of interesting discussion on both sides. Some people agreed with me that Python and JS should be taught first, because they're easier for beginners to understand and can motivate new programmers by making them productive quickly. Others argued that learning fundamentals like types and object-oriented programming in a more traditional language like Java creates more well-rounded software engineers.

I definitely see the arguments for both. When you study something like Computer Science at university, you're not studying to become a front-end developer, so focusing solely on web technologies isn't a good idea. You also need to learn about algorithms, lower-level languages like C, and how compilers work.

At the same time, I think starting with these very complicated topics is a surefire way to lose a lot of people who could become excellent developers if they'd had a more practical introduction to programming. For me at least, I was very close to giving up after my first year of studying for a bachelor's in Software Development. It felt like all I could do was create console/terminal-based applications in Java. And that simply wasn't exciting for me.

It wasn't until I discovered JavaScript, and how easy it suddenly was to get cool stuff happening on the screen, that I discovered my love for programming. And I think this motivation is absolutely key to keep people interested in development long enough to get them hooked.

That said, I think every single developer (whether they studied at university or are self-taught) owes it to themselves to learn some computer science fundamentals. Operating systems, software architecture, networking, databases, etc. are all insanely valuable topics that uncover the "magic" going on below higher-level languages like Python or JS. I'm not suggesting removing those topics at all. I just don't think they're the right place to start.

I definitely recommend reading through the above Twitter thread for some great arguments on both sides. And if you feel like you might be lacking some CS fundamentals (because you're self-taught, or just didn't understand them properly during your studies, like me), check out the free online curriculum Teach Yourself CS.

I'm Mads Brodt — a developer, author, teacher, creator and blogger. To keep up with all of my writing, follow me on Twitter.
