Innovating with Artificial Intelligence – Part 1: History

by

manchestercomp_web

Sometimes it feels like we are kidding ourselves about innovation. On the one hand, just about everyone in technology brags about how innovative they are or want to be. Yet when I describe some genuinely innovative scenarios, I often hear things like “That’s scary!”

Here’s an example: When you walk into a retail store, a sales clerk walks up to you, greets you by name, and offers you a sale on items that you have bought from them in the past.

Not scary enough? What if the sales clerk then asks about your spouse by name and suggests something that might be a nice gift for them – in their size, in colors they like?

While these examples are fairly simple (and easily implemented today), they barely scratch the surface of smarter technology. What should really blow you away is not what we can already do, but the truly innovative artificial intelligence and deep learning we’ll be able to achieve in the very near future.

But before we get into those capabilities, let’s take a look back at how we got here:

Generations of Computing

Advances in modern computer technology were really born in the 1930s from the work of Alan Turing (the subject of the 2014 film The Imitation Game) and other computer scientists of that era. The critical invention of the time was the digital computing device, which has since become the center of the technology age we live in today.

First Generation Computing (1950s)

The first generation of digital computing devices relied on vacuum tubes to switch electrical signals, which in computing were either on (1) or off (0). The limitation of these machines was that they were very large, very expensive, and very slow. But, for the first time, calculations could be automated on a large scale with greater accuracy than most humans could achieve.

Second Generation Computing (1960s)

The invention of the transistor was a huge step forward for computing. The transistor, a small device of electrical leads and semiconducting material, replaced vacuum tubes at a fraction of the cost and size. This critical step in the advancement of computers kicked off the explosive growth of computing devices that continues to this day.

Third Generation Computing (1970s)

While the transistor is still the fundamental building block of computers, the integrated circuit packed transistors at far greater density than individual components and could be dedicated to discrete functions such as arithmetic, graphics processing, and so on. The integrated circuit brought forth more powerful machines and continued to dramatically reduce the cost of building computers.

Fourth Generation Computing (1980s)

The concept of Moore’s Law really blossomed with the introduction of the microprocessor. The microprocessor is essentially the brain of a computer: the central processing unit (CPU) implemented on a single integrated circuit, with its components packed into smaller and smaller spaces. As Moore’s Law predicted, the number of transistors on a single chip could double roughly every two years, and it largely has, from those early days of the microprocessor right up to today.
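To put that doubling in perspective, here is a quick back-of-the-envelope sketch in Python. The starting chip size (roughly the 2,300 transistors of the 1971 Intel 4004) and the two-year doubling period are illustrative assumptions for the sake of the example, not figures from this post:

# Back-of-the-envelope illustration of Moore's Law-style doubling.
# The starting point (~2,300 transistors, roughly the 1971 Intel 4004)
# and the two-year doubling period are assumptions for illustration only.
start_year = 1971
start_transistors = 2_300
doubling_period_years = 2

for year in range(start_year, 2022, 10):
    doublings = (year - start_year) / doubling_period_years
    count = start_transistors * 2 ** doublings
    print(f"{year}: ~{count:,.0f} transistors")

Run as written, the estimate reaches the tens of billions of transistors by 2021, which is the same order of magnitude as today’s largest commercial processors.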

The implication of the microprocessor was that we could have a computer in every business, in every home, in every car, and just about everywhere else.

Stuck in Fourth Gear

But for all the amazing advances we have made in computing, we seem to be stuck in the Fourth Generation, which has been around since the 1980s. Previously, we saw a major breakthrough about every decade, but it has now been more than 30 years since the last one. Yes, there have been advances in networking, storage, and other areas of technology. However, as we effectively near the end of Moore’s Law (the point at which packing more transistors onto a single chip is no longer physically possible), we need to find another way of structuring computing devices if we want to achieve the same level of explosive advancement.

Today, many researchers and industry publications predict that the next great evolution of computing devices will be Artificial Intelligence (the Fifth Generation of computing). But what is that, and how will it impact corporations struggling with their current technology investments and innovations? Find out in my next blog: “Innovating with Artificial Intelligence – Part 2: Computing for Business.”

 
