Extra funding from companies and deep interest from universities have made artificial intelligence (AI) algorithms more capable and diverse. This has given businesses opportunities to automate processes and tasks currently done through repetitive manual labour, ranging from automatic invoice replies to article generation and stock trading. With such promising implementations and developments, you might ask why AI has not yet "taken over" the industry, and what is slowing it down.
One of the main issues arises from the diversity of tasks. This is also where bias first surfaces: most problems are different, so the training data, the algorithm in use, the computing costs, the testing options, everything must be chosen for each specific solution. If an AI was trained to detect trees, it would still be unable to detect flowers; if it were instead made to detect every object shown to it, the training and testing data would become much harder to prepare. In practice, most AI solutions are built to perform one specific purpose. The narrower the purpose, the better the AI performs and the cheaper it is to implement. As a result, most AIs carry a bias from their training data: they can do the task they were trained for, but cannot perform in different situations. This holds back the potential of AI, since each solution must be implemented separately, increasing the cost in both time and money.
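The tree-versus-flower point can be made concrete with a minimal, purely illustrative sketch (the nearest-centroid classifier, the feature values, and the labels below are all hypothetical): a model trained only on tree examples is structurally unable to answer anything other than a tree.

```python
from collections import Counter

def train(examples):
    """Toy nearest-centroid classifier.
    examples: list of (feature_vector, label) pairs."""
    sums, counts = {}, Counter()
    for features, label in examples:
        counts[label] += 1
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
    # One mean feature vector (centroid) per label seen in training.
    return {label: [v / counts[label] for v in vec] for label, vec in sums.items()}

def predict(centroids, features):
    """Return the label whose centroid is closest to the input."""
    def sq_dist(label):
        return sum((a - b) ** 2 for a, b in zip(centroids[label], features))
    return min(centroids, key=sq_dist)

# Trained only on (invented) tree measurements: height in metres, leaf size in cm.
model = train([([20.0, 6.0], "oak"), ([30.0, 3.0], "pine")])

# A daisy (0.1 m tall, 0.5 cm petals) is still forced into a tree class,
# because "flower" was never in the training data.
print(predict(model, [0.1, 0.5]))  # prints "oak"
```

The model has no way to say "none of the above": whatever it is shown, it answers with one of the labels it was trained on, which is exactly the narrow-purpose bias described above.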
Another reason for the slow adoption of AI in businesses comes directly from bias. This can be seen exceptionally well in medicine, where an AI must handle diseases of varying type and severity. If it was trained on data from moderately ill patients, it would be slow to send patients to the emergency room. If, on the other hand, the training data consisted only of the seriously ill, the AI would take unnecessary steps for mild cases and potentially overwork the staff. Another problem arises when the training data does not include enough people from diverse demographics. Unrelated factors can look like correlations, and such bias may be difficult to notice even for humans, let alone for a machine trained to look for patterns; as the saying goes, correlation does not imply causation. A data set covering as many demographics and cases as possible is therefore required. Unfortunately, this is harder to achieve than it sounds: some people consult a doctor over small details, others only come when severely ill; some trust doctors more, others less. This means AI must be supervised and managed closely to make sure it does not discriminate against or misclassify patients, but that is not easy either. Unlike people, an AI cannot converse and explain what it thinks about its cases; it simply returns predicted values. Specialists need to test it frequently, check whether the algorithm is predicting with a bias, and work out why that bias might have appeared. The training data then needs to be adjusted, the AI modified, and the whole implementation process repeated, leading to a costly solution. As said before, the more diverse the problem is, the harder it is to train a good AI to handle it, so suggesting treatment or prescribing medicine is usually still done by a doctor.
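One simple form the specialists' frequent testing can take is a per-group breakdown: compute accuracy separately for each demographic group in a prediction log, so that a subgroup the model systematically under-triages becomes visible instead of being hidden in an overall average. The group names, labels, and records below are invented purely for illustration.

```python
# Hypothetical prediction log: (demographic group, true severity, predicted severity).
records = [
    ("group_a", "severe", "severe"),
    ("group_a", "mild",   "mild"),
    ("group_a", "severe", "severe"),
    ("group_b", "severe", "mild"),   # under-triaged
    ("group_b", "mild",   "mild"),
    ("group_b", "severe", "mild"),   # under-triaged
]

def accuracy_by_group(records):
    """Fraction of correct predictions, computed per demographic group."""
    totals, correct = {}, {}
    for group, truth, pred in records:
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + (truth == pred)
    return {group: correct[group] / totals[group] for group in totals}

print(accuracy_by_group(records))
# group_a scores perfectly while group_b is wrong on two of three cases:
# a gap that a single overall accuracy number would hide.
```

Real bias audits are considerably more involved, but the design idea is the same: never evaluate the model on the aggregate alone, always slice the results by the groups you are worried about.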
All in all, even though AI can do many things and can be implemented to help people in various situations, only a few chosen solutions are carried out. A shortage of experts and the difficulty of obtaining training data do not help either. Fortunately, with all the new investments and businesses opening up, the future of AI looks promising: more experts, more diverse algorithms, more opportunities, and a better understanding of AI processes. Everything is moving towards automation, where humans will only need to do the unique and stimulating tasks, relieved of the time-consuming and repetitive ones.
Would you like to exchange ideas with us on the subject of digital transformation and process automation without obligation? Let’s talk!