Often the terms “machine learning” and “artificial intelligence” (AI) are used interchangeably. However, although they are related, they are quite different concepts.

 

AI is the more overarching term. It refers to a range of technologies that work together as a system to mimic functions usually associated with human cognition (or intelligence), such as thinking, reasoning, remembering, learning, imagining, creating and communicating. You can think of AI as attempting to simulate the human brain.

In contrast, you can think of machine learning as simulating a single neural pathway within the brain; it does not attempt to simulate the full complexity of the human brain or of human cognition. This is because machine learning is a subset of AI (alongside other subsets such as natural language processing and robotics) that focuses only on the “learning” part of the mimicked cognition. Machine learning is therefore an application of AI.

In machine learning, algorithms are used to analyse data, look for patterns or insights (i.e. learn from the data) and then make or suggest decisions based on those learnings. Effectively, the more data and the more training “runs” the machine learning model is exposed to, the more it learns and improves, so the outcome can be better learnings and/or an improved machine learning model. The model learns autonomously (i.e. without human input) from past data, its analysis of that data and the resulting insights, with the goal of increasing the accuracy of the desired output. Machine learning models usually rely on statistical methods to learn and to correct and improve themselves. I like to think of the learning as a process of evolution, with the decision-making evolving to become more and more accurate and closer to the optimal outcome.
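
To make this a little more concrete, here is a minimal sketch in Python (scikit-learn is used purely as an illustrative choice of library, and the data is synthetic). It shows a simple classifier being trained on progressively more examples, with its accuracy on unseen data typically improving as the amount of training data grows.

```python
# Minimal, illustrative sketch only: a simple classifier usually becomes more
# accurate as it is exposed to more training data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic "past data": 2,000 samples, each with 20 measured features and a
# binary label (e.g. good/poor quality). A real project would use collected data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train on progressively larger portions of the data and measure accuracy on
# data the model has never seen.
for n in (50, 200, 800, len(X_train)):
    model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>4} samples -> test accuracy {accuracy:.2f}")
```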

 

What is an example of AI and machine learning that I would be familiar with and that will help me to understand the difference? 

By now most of us will have dabbled with ChatGPT, which is an example of AI in that it is getting pretty good at simulating the human brain. It feels like ChatGPT is thinking, remembering, creating and communicating, much like human cognition. On the other hand, machine learning might feel less “intelligent”. An example is an autonomous vehicle learning how to navigate, avoid collisions and deal with route disruptions. However, it is easy to see how machine learning can help to make AI better. 

What types of industrial “neural pathways” are machine learning models best suited to? 

Machine learning can assist manufacturing, fabrication and construction in many ways, particularly applications where:

  • faster or more reliable decision-making is required (e.g. in identifying a desired visual indicator of manufacturing quality; see the sketch after this list);
  • human error is a common concern (e.g. where the human eye is not reliably distinguishing between something of good or poor quality);
  • decision-making needs to be data driven and the data is complex or large (e.g. where there are vast numbers of images requiring fast assessments of quality); and
  • customisation and adaptability are required (e.g. where the indicators of quality vary across different jobs).
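
As an illustration of the image-based quality examples above, here is another minimal Python sketch using scikit-learn. The small digits image set that ships with the library simply stands in for labelled inspection photographs (say, “good” versus “poor” welds); the dataset, library and labels are assumptions for illustration, not a recommended solution.

```python
# Illustrative sketch only: classifying small greyscale images, standing in for
# photographs of welds or machined surfaces labelled by quality.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

digits = load_digits()                                # 8x8 greyscale images with labels
X = digits.images.reshape(len(digits.images), -1)     # flatten each image into a feature vector
y = digits.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Train a classifier and report how well it labels images it has never seen.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

In practice, real inspection photographs are larger and messier than this, and a more capable image model (for example a neural network) would usually be evaluated, but the workflow is the same: labelled examples in, trained classifier out.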

 

What steps should I undertake to evaluate the relevance of machine learning to my business? 

The order may vary, but generally the steps would include the following. 

  1. Determine if you have a need for machine learning (e.g. is it for an application to solve a problem listed above?). 
  2. Determine if you have the ability to readily collect relevant data. 
  3. Assess if that data is of high enough quality and reliability to train AI models (a simple example of such checks is sketched after this list). 
  4. Build a business case: will automating this have a benefit? Does the benefit justify any further investment? 
  5. Create capability, either by training existing staff, hiring new staff or contracting in consultants (you may need someone to assist even at the early stages). 
  6. Create rules for data collection, storage, use, sharing and protection and ensure legislative compliance (i.e. a data governance framework). 
  7. Evaluate solutions: some can be off the shelf, others may be bespoke. 
  8. Undertake a pilot or proof of concept trial (with a minimum viable product). 
  9. Evaluate effectiveness and review the business case. 
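
To give a taste of what step 3 might look like in practice, here is a minimal sketch in Python using the pandas library. The file name and column names are hypothetical; the point is simply that basic checks of completeness, duplication and label balance can be run before any time is spent training a model.

```python
# Illustrative sketch only: quick checks of whether collected data is complete
# and balanced enough to be worth using to train a model.
import pandas as pd

# Hypothetical quality-inspection log exported from the shop floor.
df = pd.read_csv("inspection_records.csv")

print("rows:", len(df))
print("missing values per column:")
print(df.isna().sum())
print("duplicate rows:", df.duplicated().sum())

# For a supervised model you also want reasonably balanced labels,
# e.g. not 99% "pass" and 1% "fail".
print("label balance:")
print(df["quality_label"].value_counts(normalize=True))
```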

 

HERA has an Industry 4.0 Cluster (and Industry 4.0 is, by the way, an even more overarching term than AI!) that you can join if you are interested in learning more about Industry 4.0 technologies, such as AI, augmented reality, robotics, and digital twinning. Contact Holger Heinzel for more information about this cluster.