Common myths about AI


Introduction

Although AI is rapidly becoming mainstream, for many people there is still a haze of mystery around it.

There are also plenty of persistent myths about AI.

Below we will debunk four common ones.

1. Do I need a PhD?

Do I need a PhD for AI?

In essence, AI is mostly math. However, that does not mean it has to be scary: remember, the basic building blocks consist of elementary mathematical operations.
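To make that concrete, here is a single artificial neuron in plain Python: nothing more than multiplication, addition, and a simple non-linearity. (A toy sketch; real networks stack many of these.)

```python
def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, plus a bias term.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # ReLU activation: pass positive values through, clamp negatives to zero.
    return max(0.0, total)

# 1.0 * 0.5 + 2.0 * (-0.25) + 0.1 = 0.1
print(neuron([1.0, 2.0], [0.5, -0.25], 0.1))  # → 0.1
```

That is all a neuron is: a few multiplications, an addition, and a max.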

The answer to this question is:

It depends.

And that is true of any scientific field, not just AI.

It depends on what you want to do.

Despite what you often read, AI is just another technology. To understand and apply most AI techniques to your problem today, you definitely do not need a PhD, much like you do not need a PhD in electrical engineering to use a smartphone.

There are plenty of resources, frameworks, and even pretrained models out there to get you going, and more and more tools and platforms are making it easier than ever to get started.

If, however, you want to get into research (academic or R&D), then, much like in any field, a PhD might be a good idea or even a requirement. But it is definitely not a prerequisite for applying and benefiting from AI today.

2. Do I need a lot of data?

Do I need a lot of data for AI?

Deep learning networks get more accurate the more data they see, but that does not mean you can only apply them if you have an endless stream of data.

While a lot of research still trains networks from scratch on large datasets like ImageNet to benchmark new methods, almost all practitioners use a technique called "transfer learning", which we will discuss in more depth in a later post. With transfer learning, you use an already trained network as a starting point and fine-tune it on your own data.

This means you can train a network:

  • to great results,
  • on your own data,
  • by using a fraction of the data samples.
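As a toy illustration of the mechanics (not a real pretrained network), the sketch below freezes a fixed "feature extractor" and trains only a small head on top of it. In practice the frozen part would be a genuinely pretrained model (e.g. an ImageNet-trained network), but the idea is the same: the backbone stays fixed, and only the new head learns from your small dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a "pretrained" backbone: a frozen, fixed projection.
# In a real project this would be an actual pretrained network with its
# weights frozen; a fixed random matrix keeps the sketch dependency-free.
W_frozen = rng.normal(size=(2, 16))

def features(x):
    # Frozen feature extractor: never updated during training.
    return np.tanh(x @ W_frozen)

# A small labeled dataset for the new task (a fraction of what training
# from scratch would need).
X = rng.normal(size=(40, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Train ONLY the new head (a linear classifier on the frozen features),
# here via regularized least squares.
F = features(X)
head = np.linalg.solve(F.T @ F + 1e-3 * np.eye(16), F.T @ (2 * y - 1))

pred = (features(X) @ head > 0).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy with a frozen backbone: {accuracy:.2f}")
```

Only the 16 head weights are learned; the backbone contributes its "pretrained" representation for free.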

3. Do I need a lot of GPUs?

Do I need a lot of compute for AI?

Most news about AI features ever-bigger networks, trained on more data for hours, days, or weeks on hundreds or even thousands of CPUs and GPUs. However, for the 99% of AI practitioners who are implementing practical AI solutions and using transfer learning, one or two GPUs with a decent CPU should get the job done.

Obviously, the required compute power depends heavily on the amount of data you will use, but even if you have a lot of data, start with a limited dataset and modest compute before scaling up. (We will discuss some practical strategies in a later post.)

4. Will AI replace domain experts?

Will AI replace domain experts?

In our experience, it is usually the opposite.

Deep learning is very good at learning to map inputs to outputs, but we still need people who understand the domain well: to gather the right data to learn from, to judge whether the results being produced are actually any good, and to check whether the data is representative of the real world.

When creating AI models for your own use case, you will spend a lot of (if not most of your) time making sure the data quality is up to par, and for that you will definitely need experts in whichever domain you are working in.

In Essence

There are a lot of myths about AI making their way around the world, but, as is often the case, they stem mostly from a lack of understanding and practical experience.

Remember, although AI might be destined for a profound impact, it is still just a technology and just math underneath…

Where to next

This post is part of our “Artificial Intelligence - A Practical Primer” series.

Or have a look at: