Dr. Dean Anthony Gratton, Technology Influencer, Analyst, & Futurist

Dr. Dean Anthony Gratton is a technology influencer, analyst, and futurist. He has authored several books and many contentious columns dispelling the rumours, gossip and hype surrounding new technology. Dean has worked within the wireless and telecommunications R&D industry as a software engineer and a solution architect. He has developed new products and created architectures for the consumer and industrial sectors, including the IoT, Big Data, Industry 4.0 (IIoT), the Smart Home, Smart Cities and Smart Metering. His wireless research work has been patented.

 

We have an unquestioning belief in something we have not entirely understood.

Human-like Cognition in a Computer

This is a sentiment gathered from many years working within an industry always eager to develop the next big thing. A certain excitement and momentum combine into a kind of whirlwind, where concepts are often taken out of context and exaggerated. The original conception of AI dates back to 1955 at Dartmouth College, New Hampshire, where Dr John McCarthy and his team of researchers first coined the term ‘artificial intelligence’; they have since been collectively recognised as the founding fathers of AI.

Today, artificial intelligence is assistive technology, and it is nothing more than clever programming and smart technology.

In his proposal submitted to the college, McCarthy outlined a research project planned to run for just two months, whilst he sought funding and the subsequent ‘green light’ to move forward. His proposition was incredibly ambitious, especially given the limited technology available in the mid-1950s. His conjecture was nothing more than a large “What if…” – what if we could replicate certain human-like abilities in a computer using software?

A Machine Can Do It…

McCarthy and his team were particularly interested in automation, and hypothesised whether a computer could be developed to use a language, improve itself, form abstractions, and exhibit randomness and creativity. You see, if I were to place current research and development today, across all sectors of industry, alongside McCarthy’s study, the two would not be overly dissimilar: his objectives still sit comfortably within the realms of what we’d like to achieve with artificial intelligence. What I realise is that we are continually striving to discover how effective a computer and its software can be at mimicking human cognition – it’s very much an ongoing process.

Some amazing expectations were drawn from McCarthy’s objectives, which he described as the “artificial intelligence problem”. More specifically, he argued that “the study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it” (McCarthy et al., “A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence,” August 1955). McCarthy and his team understood that similar research had already been undertaken and, with this in mind, they needed to devote further effort to gathering more accurate results in their endeavour – one which very much aligns with research today.

Automation, Robotics and Industry 4.0

I regularly see the continued excitement surrounding AI in the news and online, but I also witness the ongoing hyperbole. McCarthy never envisaged that such systems would dominate and take over the world – the concept was, at that time, nothing more than conjecture. Yet many of us today regard AI as something that will destroy us all and take over the planet – courtesy of Hollywood, of course – and that’s certainly not the case. There’s nothing to fear, despite the movies continuing to portray AI-powered entities as evil and malevolent. After all, today, AI is nothing more than clever programming and smart technology. It remains an assistive technology to aid both business and industry – and that’s it! More realistically, however, there is a sobering realisation about the future of work and our workforce as technology and its associated algorithms continue to mature. Yes, admittedly, as automation, robotics and other ancillary technologies mature, jobs will be lost, yet this transition isn’t entirely new.

We have experienced several industrial revolutions, from water and steam power, through Nikola Tesla’s AC motor, to the introduction of the microprocessor – and, to be honest, we are still defining what the fourth industrial revolution (4IR) will bring. The 4IR generation is also referred to as Industry 4.0 and is commonly known as the Industrial Internet of Things (IIoT). With the introduction of new technology there is an inevitable ‘technological unemployment’ hump. However, the workforce has every opportunity to retrain and develop new skills, applying their newly acquired capabilities to the next industrial revolution – we adapt and evolve, something we have done since the first generation.

Until next time…

We continue in our endeavour to improve and advance technology to help humankind move forward. We have long desired to make our workplaces safer and healthier, and to ensure workers are not put at risk in dangerous environments – a benefit that comes as a consequence of each industrial generational shift. We are now looking at Industry 5.0, or the fifth industrial revolution, where we can place a greater emphasis on our mental health, energy, renewables and largely anything that ‘saves the world’. Nowadays, we have become acutely aware of the immense ‘life pressures’ people face, not just from their daily jobs but also from such stresses as the commute to work amid ongoing traffic issues and train strikes, the difficulty and cost of sourcing childcare, and the dark cloak of the current cost-of-living crisis – we are seemingly overwhelmed and drowning in all sorts of issues.


Whilst we struggle to tread water and keep our heads afloat, we are nonetheless still required to deliver what is asked of us. The question surrounding the future of work and engagement has resurfaced, mostly due to the recent Coronavirus pandemic, which taught us that, in many instances, we can work from home quite successfully and still deliver what needs to be done. Our video conferencing tools allow us to remain connected with our colleagues, sustaining the camaraderie needed for a team-spirited environment, albeit a virtual one. With artificial intelligence, a connected and healthier environment, and a host of associated products, applications and services, we can remain connected, irrespective of our location.
So, this is where an “applying artificial intelligence for the better” Dr G signs off.
