
Scientists developed software to make artificial intelligence systems more trustworthy

Artificial intelligence is going to transform many industries. It should make our lives easier, but many people are more aware of its dangers than of its possibilities. Now scientists from the University of Waterloo have developed software designed to increase people's confidence in AI decisions in the financial sector.

AI systems can make very accurate predictions, but people are reluctant to trust those predictions without knowing the reasoning behind them. Image credit: Allan Ajifo via Wikimedia (CC BY 2.0)

Artificial intelligence is bound to take over the financial sector because of how powerful a tool it can be. Its deep learning algorithms can process vast amounts of information in a short period of time, allowing an AI system to make reliable predictions. It can weigh many variables at once and spot patterns in the market before they are visible to analysts. AI would be very useful for stock market predictions, qualifying people for mortgages, setting insurance premiums and so on. But for that it has to be highly reliable, and for now people do not trust it. This new software should supply that missing piece of the puzzle: it will reveal the reasoning behind an AI's decisions.

Reasons are what regulators want to know and what analysts take into account when deciding whether predictions are reliable. AI systems could already be used to make very accurate predictions, because their deep learning algorithms detect and process patterns in vast quantities of data. The amount of information is so large that even the people operating an algorithm do not know its reasoning. The scientists created such an algorithm, which predicts next-day movements of the S&P 500 stock index using information from the previous 30 days. They then developed software called CLEAR-Trade, which outlines which factors the algorithm used and from which days. In other words, CLEAR-Trade allows analysts to see how the AI made its decision and to assess whether it makes sense.
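To make the idea concrete, here is a minimal sketch of tracing a prediction back to individual days in the input window. All names and the linear model are hypothetical illustrations, not the actual CLEAR-Trade method (which works with deep learning models); the point is simply that each day's contribution to the prediction can be computed and ranked for an analyst to review.

```python
import numpy as np

# Hypothetical toy setup: a linear "predictor" over a 30-day window of
# daily returns. A real system would use a nonlinear deep learning model,
# but a linear model makes per-day contributions exact and easy to see.
rng = np.random.default_rng(0)
window = 30
weights = rng.normal(size=window)               # stand-in for learned parameters
returns = rng.normal(scale=0.01, size=window)   # stand-in for the last 30 days

# Next-day movement score produced by the model.
prediction = float(weights @ returns)

# Attribution: how much each of the 30 days contributed to the score.
# For a linear model, weight * input decomposes the prediction exactly.
contributions = weights * returns

# Rank the most influential days, as an explanation tool might present them.
top_days = np.argsort(-np.abs(contributions))[:5]
print("prediction:", prediction)
print("most influential days (0 = oldest):", top_days)
```

With a nonlinear model, contributions like these are typically approximated (for example, with gradient-based saliency), but the output an analyst sees is the same in spirit: which days and factors drove the decision.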

The ability to explain AI decisions is going to be very important in the near future for building trust between machines and people. Devinder Kumar, lead researcher of this study, said: "If you're investing millions of dollars, you can't just blindly trust a machine when it says a stock will go up or down. This will allow financial institutions to use the most powerful, state-of-the-art methods to make decisions".

This is quite interesting from another angle as well. It seems that those who control AI will be able to earn big in financial markets. And where there is money, there is more money to be made; we have no doubt that software explaining the decisions of AI systems will be very popular.


Source: University of Waterloo
