A number of years ago I was listening to Keith Jarrett with my dad. Keith Jarrett is an incredibly versatile and prolific pianist, famous for jazz improvisation and, later in his career, for his interpretations of many classical pieces (Keith Jarrett on Spotify). Without question, at least in my opinion, he's one of the top pianists in the world. Well, my dad and I were listening to the album “The Celestial Hawk” (link withheld on purpose) and my dad made a statement that has stuck with me ever since: “There's Keith Jarrett I like, and there's Keith Jarrett I don't like. This is one I don't like.” Not only did I find that statement hilarious, but even though I totally agree with him, it's an interesting viewpoint. Here's a musician I have a ton of respect for, but just because he is one of my all-time favorites does not mean I like or agree with everything he has done in his career. I don't sit down with an album and pretend it's the best thing I've ever heard. I listen, I try to understand it, and I'm not afraid to be critical if I don't like it, or to share it with my friends if I think it's the best thing ever. I'm not going to spend time trying to convince you that it's not a great album; maybe you will like it, maybe not. It doesn't bother me if you have different preferences than I do, and I would rather discuss it from that viewpoint than waste time arguing about who is right and who is wrong. There is no right answer.
I treat technology the same way. I mean, to say that this has been a weird year with respect to Artificial Intelligence would be an understatement. We've been hearing rumors about Large Language Models, or Foundation Models, in AI for a few years now, but this year we have witnessed the commercialization of these systems. This has not just reframed the way people work but has also brought to light a lot of the consequences of, and even fear of, these systems. There is no doubt that these systems are useful, but listening to some of the bandwagon brainwashed, you'd think that we should all just put away our keyboards and let this new technology do everything for us. They tell us that these things can write code for us, are better at managing our health problems than we are, can pass bar exams, and can even write articles, movies, and music; pretty much everything. But it's naive, right? I mean seriously, if you don't understand how to prompt one, you are not going to get very good results. You can certainly get some bits and pieces, but you still really need a professional to decide if it's right or wrong. Even the people that build these systems don't really understand how they work under the hood; they are really just making predictions based on content they have been trained on. They take the context of the question you ask, predict what the intent of the question is, and generate a sequence of words that resembles the way we write text. In the end, it's just a lot of probability, math, and powerful computers. Is it cool? Sure. Is it smart? Can it think? No, not really, at least not yet. For me, this is no different than using Google to look something up, or reading academic articles, or even books. We know all of these sources are biased, and we know that there are authors, sources, and media we trust more than others, but we read it all, think about it, and form an opinion. It's this last bit that really worries me.
Setting AI aside for a moment, one of my biggest concerns lately is how people seem to have stopped thinking critically. Sure, everyone has an opinion, but often they are just echoing something they have heard somewhere else without really thinking about it. In tech you see it everywhere: “Mac vs PC”, “JavaScript vs Python”, “Intel vs AMD”, “AWS, Azure, GCP or OCI”. Not only do we waste a ton of time trying to convince each other that the one we like is the best, it's often done blindly. Let's say you have a choice between Windows, Mac, or Linux - which one are you going to pick? If you picked one already, you are wrong! You don't know the answer yet because you don't know what the system will be used for. I mean, you can certainly start the discussion by announcing your bias towards Apple systems, but if I then turn around and say that the application is machine learning, you might change your tune. If you double down and say that it has to be Apple, Apple is the best, no other operating system is worth considering, then you are just part of the problem. Yes, you can jump through some hoops and do machine learning on Apple systems, but there are a lot of good reasons to go with Linux. Of course, I still have not mentioned what type of machine learning we are going to do or who is going to be doing it. Again, this opens up a host of new questions and considerations about which system is best suited for the job. There's a reason all of these systems exist; they all have their place, and depending on the work you do, some of them are better fits than others.
Even worse, it now seems that there are a lot of people (or maybe they are AIs?) attacking different technologies in an attempt to defame them. Again, it's annoying and it wastes a lot of time. You are much, much, MUCH better off just doing your own research and learning about these systems than arguing with people (or bots). These attacks come from a few different directions:
obvious clickbait like “Agile is Dead!”. Agile is a methodology and a framework for organizing your work. Just because it does not work for you and your team does not mean it's not going to work for me and my team. You have to actually think about a number of different factors to decide how to run a team; there is no single solution that works well for every team, regardless of size or technical capabilities.
the tech holy wars - “JavaScript runs the internet!”. So if I'm not using JavaScript everywhere, am I doing it wrong? Obviously not. JavaScript is just another language. If I'm building machine learning models, I'm not going to attempt to do it in JavaScript, just like, if I'm going to build a web application, I'm not going to attempt to do it in Python. Use the language that makes the most sense to you; just because everyone else is doing it does not mean you have to do it too. Sometimes everyone else is wrong.
the smear campaigns - “Apple hires 8 month old babies to work in their factories!”. It's cool if you want to make your tech decisions based on environmental impact, humanitarian grounds, or other things you strongly believe in. Just make sure you know what's really going on before you jump on that bandwagon. Often you are picking the lesser of two evils, or there is a single instance of a subcontractor, hired by a supplier, that did something egregious the actual manufacturer could not possibly have known about. Often they go and fix it when they find out. I'm not defending anyone here - just make sure you have all of your facts straight before you make a choice.
My co-workers are going to kill me for this, but “back in the good ol' days” (like the early/mid 90s), internet culture in tech was very different from what it is today. We would write a lot of open source software, share ideas, and have constructive debates about how to make things better. If you used someone else's software, you would patch it for your own systems and then send the patches back in so that other people could take advantage of them. Yeah, sure, we had tech holy wars like “vi (vim) vs emacs” - but no one really cared which one you used; just pick the one that makes the most sense to you.
A solid approach to tech is to keep a very open mind and learn as much as you can about how these systems work. Try to understand how they were built and what types of problems the original architects and designers were trying to solve. I will often look at some new technology, see a good idea, and figure out how to leverage it in one of my own systems. Don't just follow the crowd; try to understand what the crowd is doing and do something better, different, or even totally opposite. Don't lock yourself in, but also don't be scared to lock yourself in, because most of the work you do is actually in the learning phase, not the implementation. The bulk of the work is dreaming up the design; the implementation should be easier, and once you have implemented it once, it should be easy to implement again in another language or in a different cloud. If you can do this effectively, you will have an easy time deciding “what you like and what you don't like”, and can spend all of your energy growing the tech and ideas you like.