It is hard to believe that the artificial intelligence Netflix uses to recommend shows today predates the streaming service itself, which was originally set up to send out DVD rentals by post.
"A lot of the algorithms were invented in the 80s," explains Dr Haytham Assem, a chief scientist at the Huawei Ireland Research Centre.
Perhaps AI’s most pivotal moment came with the 1986 publication of “Learning representations by back-propagating errors”, an academic paper introducing back-propagation, a new learning procedure for networks of neuron-like units.
“This is the traditional learning method by which advanced AI architectures get trained nowadays. Without back-propagation, most advancements in deep learning would not have been possible,” says Assem.
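The idea is simple to sketch in code: errors at the network's output are propagated backwards, layer by layer, to tell each weight how to change. Below is a minimal illustration, a tiny two-layer network learning the XOR function; the layer sizes, learning rate and iteration count are illustrative choices, not taken from the original paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training data: output is 1 only when the two inputs differ
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random weights for a 2 -> 4 -> 1 network (sizes chosen for illustration)
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

initial_loss = float(((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2).mean())

lr = 0.5
for _ in range(10_000):
    # Forward pass: compute the network's current predictions
    h = sigmoid(X @ W1 + b1)      # hidden-layer activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: push the output error back through each layer
    d_out = (out - y) * out * (1 - out)   # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient propagated to the hidden layer

    # Gradient-descent updates derived from the propagated errors
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

final_loss = float(((out - y) ** 2).mean())
```

After training, the loss has fallen well below its starting value, showing the backward-propagated gradients steadily correcting the weights; the same mechanism, scaled up enormously, is what trains modern deep networks.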
It took a subsequent rise in computing power, fibre-optic cables and modern semiconductor chips to accelerate widespread progress.
“The invention of 3G and 4G was a stepping stone in enabling large quantities of data to move back and forth at rapid speeds,” he explains.
“None of the video streaming services that we enjoy now would have been possible without these. Without these communication technologies, the truth is that it would not have been possible to enable the AI app economy that we are experiencing now.”
Modern AI chips developed in the last decade don’t just provide raw power but are optimised to carry out the typical maths operations of AI algorithms more quickly, he adds.
"Without the progress of these chips, it would not have been possible to think about autonomous vehicles, video analytics, object detection and data centre acceleration. For instance, the latest chips from Intel released in 2020 can run over 10 trillion calculations per second. To give you an idea of the scale, if you stacked one dollar bill on top of another for each calculation, the stack would go to the moon and back, then halfway back to the moon again," says Assem.
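That image holds up to a back-of-envelope check. The 10-trillion figure comes from the quote; the bill thickness (about 0.11 mm) and the average Earth-Moon distance (about 384,400 km) are standard reference figures assumed here:

```python
# Sanity-check the dollar-bill analogy for 10 trillion calculations per second
calculations = 10e12             # 10 trillion calculations in one second
bill_thickness_m = 0.00011       # a US dollar bill is roughly 0.11 mm thick
moon_distance_m = 384_400_000    # average Earth-Moon distance in metres

stack_height_m = calculations * bill_thickness_m
trips = stack_height_m / moon_distance_m  # stack height in one-way Moon distances
```

The stack works out at roughly 2.9 one-way trips to the moon, close to the "there and back, then halfway again" (2.5 trips) of the quote.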
These advances in computing power are enabling AI to deliver added value for businesses. They are, for example, the power behind chatbots providing round-the-clock support, points out Mark Jordan, chief technologist with Skillnet Ireland, the workforce learning agency.
“More than 90 per cent of businesses have adopted machine learning of one form or another to drive their business forward, yielding gains in areas such as enhanced decision-making, boosting productivity, removing repetitive tasks and moving the workforce up the value chain,” says Jordan.
AI has streamlined a range of industries, from automotive to logistics and healthcare. “We’re seeing it in public transportation, where heat maps and route demand feeds into time tabling,” he says.
Some workers will be replaced by automation, but new jobs will be introduced
A bus data engine developed by Irish company CitySwift, for example, is used by local authorities to match bus schedules with passenger demand.
AI is everywhere, from forecasting weather to filtering spam emails, to enabling search predictions in Google and voice recognition services such as Apple's Siri and Amazon's Alexa.
"In the last 10 years, AI and machine learning (ML) have had a huge impact on our lives. Healthcare, security, business, economics, finance, and education have experienced the most important changes," says Alessia Paccagnini, academic director of the master in quantitative finance at the Michael Smurfit Business School, University College Dublin.
“The benefits of implementing AI in finance, in particular for task automation, fraud detection, and delivering personalised recommendations, are monumental.”
But its rise is not without risk. Lack of governance around the core data used and augmented by AI and ML, and how it is used, is one. “The European Commission is really keen to build regulation around AI, which is necessary,” says Mark Jordan.
And while AI has significant potential to boost economic growth and productivity, “at the same time, it creates equally serious risks of job market polarisation – the necessity to have high-skilled jobs since low-skilled ones will be substituted by automation – rising inequality, structural unemployment, and the emergence of new, undesirable industrial structures,” says Paccagnini.
“For this reason, new regulations, at national and international levels, are necessary and important to avoid distortions and risks involved in the adoption of AI.”
AI offers potential for advances in sectors such as healthcare, cybersecurity, economics, business, manufacturing, supply chain management, construction, and retail, as well as information technology, finance, climate change, education, and travel.
“Some workers will be replaced by automation, but new jobs will be introduced, and education systems and universities will start to adapt their courses, incorporating new technologies and offering new programmes,” she explains.
They are already emerging. Researchers at UCD and at its Michael Smurfit Graduate Business School are working on the application of AI to cybersecurity, fintech, and digital services.
UCD’s master’s degree programmes in finance, quantitative finance, and financial data science – its most recent – all include advanced courses in statistical learning and machine learning that build the basic concepts for AI applications in finance.
Trinity College Dublin has a newly launched artificial intelligence accelerator programme, Alsessor, while the University of Limerick is pioneering new integrated undergraduate and master's degrees in partnership with technology companies such as Stripe, Zalando, Shopify and Intercom, which will support new AI-based start-ups.
Ireland already has the highest share of enterprises in Europe using artificial intelligence, according to Eurostat.
Government-backed research centres help, including CeADAR, which specialises in innovation and applied research and development in such areas as AI, machine learning and data analytics.
The pandemic has helped fuel AI’s progress too.
The emergence of a hybrid workforce will imply more collaborative experiences with AI
“During the pandemic we have seen how AI is already changing the way businesses operate, from how they communicate with their customers via virtual assistants, to automating key workflows, and even managing network security,” says Assem.
This, he reckons, will only grow in the predicted hybrid working world of the future.
“Undoubtedly, the emergence of a hybrid workforce will imply more collaborative experiences with AI,” he explains.
“The human workforce will need to have access to more automated bots and various digital assistants that can support a hybrid model. It will be a collaboration between the workforce and AI; one will not replace the other.”