Taking a look at Accenture’s Technology Vision 2023 – Part 2 of 2

In my last post, I took a look at the first two trends (digital identity, big/bigger data) discussed in Accenture's Technology Vision 2023 report. This time around, I'll see what else they had to say about the technology trends they see as most impactful to the convergence of the physical and digital worlds and, hence, to the enterprises Accenture works with.

Not surprisingly, AI is on their list, in this case as the trend toward Generalizing AI.

AI has been around for a good long while, but it was the 2020 release of OpenAI's GPT-3, the largest language model to date, that really began to turn heads. What GPT-3 did was show off breakthrough capabilities, "teaching itself to perform tasks it had never been trained on, and outperforming models that were trained on those tasks." All of a sudden, a model didn't have to be created to perform a specific task within its data modality (e.g., text, images). We're now heading into the territory of multimodal models, "which are trained on multiple types of data (like text, image, video, or sound) and to identify the relationships between them," and which have hundreds of billions, even trillions, of parameters. Game changer! We're still not replacing humans quite yet, but Accenture cites one "generalist agent" that can perform and seamlessly switch between more than 600 tasks, including chatting, captioning images, playing video games, and operating a robotic arm.

Generalizing AI is made possible thanks to a couple of important innovations. Transformer models:

…are neural networks that identify and track relationships in sequential data (like the words in a sentence), to learn how they depend on and influence each other. They are typically trained via self-supervised learning, which for a large language model could mean poring through billions of blocks of text, hiding words from itself, guessing what they are based on surrounding context, and repeating until it can predict those words with high accuracy. This technique works well for other types of sequential data too: some multimodal text-to-image generators work by predicting clusters of pixels based on their surroundings.
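
To make the masked-word idea concrete, here's a minimal sketch in Python (my own illustration, not anything from the report) of how such training pairs get produced: hide a word, keep the surrounding words as the input, and use the hidden word as the target.

```python
import random

def make_masked_examples(sentence, mask_rate=0.15, seed=0):
    """Turn one sentence into (masked_context, target_word) pairs,
    mimicking the self-supervised objective described above: the model
    hides words from itself and learns to guess them back from the
    surrounding context."""
    rng = random.Random(seed)
    words = sentence.split()
    examples = []
    for i, word in enumerate(words):
        if rng.random() < mask_rate:
            # Replace this word with a mask token; the rest is context.
            context = words[:i] + ["[MASK]"] + words[i + 1:]
            examples.append((" ".join(context), word))
    return examples

if __name__ == "__main__":
    text = "transformer models identify and track relationships in sequential data"
    for context, target in make_masked_examples(text, mask_rate=0.3):
        print(f"input:  {context}\ntarget: {target}")
```

In real pretraining this happens over billions of text blocks, and the "guessing" is done by the network itself, with its weights nudged toward better predictions on every pass.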

Scale is the second innovation here. Increased computing power lets transformer models incorporate vastly more parameters. (Trillions, anyone?) This both yields greater accuracy and enables the models to learn new tasks.
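
Where do counts like that come from? A rough back-of-envelope (my own approximation, not from the report) is that each transformer layer carries about 12 × d_model² weights, so plugging in GPT-3's published configuration of 96 layers with a hidden size of 12,288 lands right around the famous 175 billion:

```python
def approx_transformer_params(n_layers: int, d_model: int) -> float:
    """Back-of-envelope transformer parameter count, ignoring embeddings
    and biases: per layer, ~4*d^2 for the attention projections
    (Q, K, V, output) plus ~8*d^2 for the feed-forward block
    (two d x 4d matrices) gives ~12*d^2."""
    return 12 * n_layers * d_model ** 2

# GPT-3's published configuration: 96 layers, hidden size 12,288
print(f"~{approx_transformer_params(96, 12_288) / 1e9:.0f}B parameters")  # ~174B
```

Widening the model grows that number quadratically in d_model, which is how the industry gets from billions toward trillions.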

The Accenture report culminates in an exploration of what they term “the big bang of computing and science,” a feedback loop between technology and science where breakthroughs in one domain spur breakthroughs in the other – all occurring at hyper speed.

In this section, Accenture describes how science and technology are pushing the envelope in several different industries. In materials and energy, supercomputers operating at exascale will enable chemists to perform molecular simulations with greater accuracy, coming up with new materials to tackle problems such as climate change. And as we push up against the inevitable limits of even the most powerful supercomputers, quantum computing will step in, with chemistry expected to be among the first fields to benefit.

New rocket and satellite technologies are enabling scientists to conduct more experiments in space, where the unique conditions are "accelerating what we can learn about fluid physics, diseases, materials, climate change, and more, to improve life on Earth." A decrease in the cost of components and an increase in the involvement of the private sector mean that the once-prohibitive costs of experimentation in space are coming down. There's even a startup offering "digital lab space."

In biology, the computing-science "big bang" has brought about "an entirely new field: synthetic biology…[which] combines engineering principles with biology to create new organisms or enhance existing ones." This has implications for any number of life's necessities: food, drugs, fuels. The costs of DNA sequencing and synthesis are having a Moore's Law moment, halving every two years. (I didn't check the arithmetic – I'll trust Accenture here! – but in 2001, sequencing the human genome cost $100 million. Today, it's about $600.)
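
For the curious, here's a quick sanity check of that arithmetic in Python (assuming "today" means 2023, when the report was published). It turns out the fall from $100 million to $600 is even steeper than a halving every two years:

```python
import math

start_cost, end_cost = 100e6, 600   # 2001 genome vs. "today" (assumed 2023)
years = 2023 - 2001

# How many halvings does $100M -> $600 represent?
halvings = math.log2(start_cost / end_cost)
print(f"halvings needed: {halvings:.1f}")                       # ~17.3
print(f"implied halving period: {years / halvings:.2f} years")  # ~1.27

# For comparison, a strict halving every two years would leave:
print(f"cost after {years} years at one halving per 2 years: "
      f"${start_cost / 2 ** (years / 2):,.0f}")                 # ~$48,828
```

So if anything, Accenture is underselling it: by these figures, sequencing costs have been halving closer to every 15 months than every two years.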

The Accenture report is totally free. (You don't even have to sign up to access it.) It's always interesting to see what intelligent observers have to say about what's happening in the world of science and technology.