
Is Machine Learning Overrated or Overhyped?

By Abhishek Rungta | December 19, 2016

Not a day passes without us hearing or reading something about machine learning or artificial intelligence. It has become a buzzword, eagerly lapped up by journalists, analysts and even CEOs. Of course, a C-level executive has to drop these buzzwords in front of investors and others in order to seem relevant.

However, it does not take long to notice the staggering increase in how often we hear the words ‘machine learning’ and ‘artificial intelligence’. We need to acknowledge that machine learning has been around since the World War II era and has continued to evolve ever since. Its progress was limited only by the hardware of the day; the algorithms, the concepts and the ideas were all there.

Scott Aaronson, a theoretical computer scientist at MIT, observes that the way machine learning is discussed today is very similar to how people discussed computers in the 1950s. People spoke of computers and humanoids in the same breath, but they did not foresee the Internet coming. Yet military officials and others did have an inkling of the Internet and how it might change the world.

Similarly, machine learning is certainly big, and it deserves attention. Yet it is overhyped, because machine learning isn’t new: it consists of computer algorithms that have been rebranded as something futuristic when they have been there all along. In this article, let us look at why machine learning is overhyped and why, even so, it probably deserves the extra attention.

It’s been around

The history of machine learning dates back to 1959, when Arthur Samuel defined it as the field that gives computers the ability to learn without being explicitly programmed. From then on, it was closely associated with artificial intelligence, which is more theoretical in nature and is concerned with giving computer systems the ability to perform tasks that would otherwise require human intelligence.
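Samuel’s definition is easiest to grasp in miniature. The Python sketch below is an illustration added here, not anything from Samuel’s own work: it fits a straight line to example points by least squares, so the rule y = 2x + 1 is never written into the program; it is recovered from the data.

```python
# A minimal sketch of "learning without being explicitly programmed":
# the program is never told the rule y = 2x + 1; it infers it from examples.

def fit_line(points):
    """Ordinary least squares for one feature: returns (slope, intercept)."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
             / sum((x - mean_x) ** 2 for x, _ in points))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Training examples generated by the hidden rule y = 2x + 1.
examples = [(1, 3), (2, 5), (3, 7), (4, 9)]
slope, intercept = fit_line(examples)
print(slope, intercept)        # 2.0, 1.0 -- recovered from data, not programmed in
print(slope * 10 + intercept)  # prediction for unseen input x = 10 -> 21.0
```

Scaled up from four points to millions, and from one slope to millions of parameters, this is the essence of what the buzzword describes.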

It was during the 1980s and 1990s, when computers were developed to operate more autonomously, that machine learning came to be defined in practical rather than purely theoretical terms. Machine learning is realistic and objective in nature, concerned with what can actually be achieved today, whereas artificial intelligence focuses more on research and on what may come next. With the Internet of Things becoming more common, machine learning no longer seems as far-fetched as it once did.

However, the way the phrase is dropped at conventions, seminars and networking events, especially by those in the software or computer science industry, makes some people a little skeptical. After all, machine learning isn’t new; all computers are machines, and getting them to work autonomously is exactly what programming is all about.

It’s got a future

It hardly needs repeating that programming, artificial intelligence, the Internet and everything else related to computer science will continue to grow. When someone repeatedly discusses machine learning as if it were something new and special, it gives the sense of trying too hard to prove something. While machine learning is an established scientific field, overuse of the term has led some to believe it is a new science that is going to change things dramatically.

On the contrary, machine learning has been around for decades, and its application will become quite widespread. The likely reason for its wider growth and application is the IoT’s growth in scale and stature: as more objects become ‘smart’, more machines can be ‘taught’ to interact with humans and computers in more advanced ways. This is a more realistic way of understanding machine learning than discussing it as if it were a fad. Machine learning isn’t a fad; it is an established field of computer science.

Be wary of overuse

Machine learning, like every other tech buzzword, seems very appealing. Once you begin to read the many articles churned out glorifying it, you might end up using the term all too often yourself. It is neither impressive nor necessary to use buzzwords in order to sound informed. In fact, overusing the term may actually harm those who are really working in machine learning.

Machine learning in reality is a tedious and difficult science that involves a lot of programming and coding. It is not half as glamorous as a journalist might make it seem. Yet it is an extremely important part of computer science, and it is certainly changing the world as we know it, just as every field of science does.

Even a field as fundamental as neuroscience has given us a better understanding of how humans work and think, leading to improved treatments. Yet nobody discusses neuroscience the way they discuss machine learning, though it is advancing at roughly the same rate. Certainly, one should be wary of the hype around machine learning. The field itself, however, is very promising and will continue to evolve, just like any other field of science.

Avoid the hype, embrace the facts

There is a lot of hype around different technologies today. It is important to remember that no particular technology is superior to another, and that every field has its role to play in the development of society. Machine learning, likewise, has been around for decades; it is only now that it is receiving so much attention.

This neither means that it doesn’t deserve the attention it is getting, nor that the hype is entirely justified. The hype can be read to mean that machine learning is more relevant today than it was yesterday, and it is most certainly going to remain as important as it seems today. On that note, it probably deserves the attention it is receiving.
