Post Pandemic: The Age of AI Acceleration P2

There can be little dispute that artificial intelligence is an utterly transformative technology, one that exercises its power in our daily lives by changing our relationships with machines, with one another, and with ourselves. It will soon be as ubiquitous in society as the internet, and its impact is slated to be as unshakeable as it is profound.
April 29, 2022

The pandemic tipped the proverbial scales decidedly toward AI integration into society, and even as Covid wanes, artificial intelligence is waxing strong. AI is poised to disrupt many, if not most, industries in the years ahead. Those in the avant-garde will likely thrive.

For product development firms looking to ride the wave, surf's up: opportunities abound.

AI in Manufacturing: Automation and Output

In simplified terms, when it comes to making widgets, AI will accelerate and increase automation and output. It is, and will continue to be, a substitute for humans performing routine physical tasks and, increasingly, non-routine cognitive tasks too. The pandemic was a harsh but historic eureka moment for humankind as the definition of "essential worker" contracted. An epiphany for many, the fulfillment of their own prophecies for a few: most jobs can be done remotely, and many more can be done by AI. These truths are evident in the sharp increase in demand for industrial robots in manufacturing and many other industries over the last year.

The Age of AI Acceleration Is the Fourth Industrial Revolution

Via artificial intelligence and machine learning, humanity has entered the fourth industrial revolution, and it is as irresistible to us as stone, steel, steam, or electricity. The downside, it is often said, is that AI is coming for our jobs. A fairer assessment: AI is taking over the jobs humans don't want to do or can't do. In the right light, it is providing us with better ones; more importantly, it may be delivering the world from economic collapse. With birth rates at an all-time low, there simply aren't enough workers being born to support our existing manufacturing systems.

"[Tesla Bot] has the potential to be a generalized substitute for human labor over time. The foundation of the economy is labor. Capital equipment is essentially distilled labor...The fundamental constraint is labor. There are not enough people. I can't emphasize this enough. There are not enough people. I think one of the biggest risks for civilizations is the low birthrate and the rapidly declining birthrate." -Elon Musk

AI Automation in Product Design and Development

Increased automation, of course, opens up myriad opportunities for product design companies on the "hardware" side of development. Still, it is important to note that the impact AI has had, and will continue to have, on manufacturing is much bigger than physical machinery. AI and ML were deployed in virtual systems to untangle mired global supply chains during the pandemic. Now field-tested, these new logistics technologies look set to become the norm. What's more, predicting equipment failures, creating logistics contingency plans, and even researching and developing products are all well within the scope of this soon-to-be ubiquitous technology.

AI Developing Products: Opportunity or Obsolescence?

Although robots (or cobots, as collaborative models are now called) are widely expected to take over blue-collar tasks--those that are dull, dirty, or dangerous--we can now add a fourth "D": developmental. We tend to imagine that AI lacks the human touch when it comes to product development. Like everything else concerning AI, this idea is on the docket to be challenged. In a recent paper, Google laid out how it developed a reinforcement-learning deep neural network that designs computer chips and--wait for it--designs them faster than humans do. Much faster. What takes engineers months to accomplish took the new AI software less than a human workday: six hours.


Furthermore, the chips it designed can be applied to create better AI systems, to then create better AI systems (and so on), resulting in exponential innovation gains. Of course, this doesn't mean product developers will be out of a job anytime soon; in fact, it opens up myriad opportunities in AI hardware and software product development. But, we're not gonna lie--it does feel a bit like we'll be training our own replacement.  

Fixing a Broken Healthcare System with AI

Healthcare was broken long before the pandemic, so the adoption of AI technologies to help reduce costs, enhance patient care, and supplement a dwindling supply of medical specialists is a welcome one. Other drivers include the explosion of chronic diseases and a growing population of elderly patients living longer lives. All of these factors necessitate solutions far beyond the scope of mere human intervention.

Big Data: Big Opportunities in AI and ML

The vast amounts of patient healthcare data alone are reason enough to implement AI, which can handily manage, store, recall, and decipher it while humans do presumably more productive things. There is also the increasing demand for personalized medicine, which, ironically, artificial intelligence may have the potential to administer better than any human. More novel technologies like machine learning are slowly being integrated into clinical applications such as radiology and the early identification of disease. All of these technologies require new medical device product development or improvement of existing systems.

The AI in Healthcare Boon

As far as the specific opportunities present in the artificial intelligence healthcare market, it is predicted that by 2030--just eight years from now--the industry will reach $287 billion in spending. Some of the top applications, virtual assistants, robot-assisted surgery, and wearables, are all phenomenal opportunities for product design, industrial design, and industrial engineering companies to grab a substantial piece of the emerging "AI for healthcare" pie.

"No-Code" AI

Just as there are mountains of code behind no-code websites and apps, no-code AI has plenty of code under the hood, just not for the end-user. Much like web development, the driving force behind no-code AI is not (at this point) to bring AI to the masses but rather to fill the growing demand for AI within large organizations, compounded by a scarcity of AI engineers. Intuitive interfaces, drag-and-drop tools, and pre-built modules are all on the horizon for forward-thinking companies. What this looks like will depend on the size, scope, and niche of the businesses in which they are deployed. Some examples from large companies include P&G's AI-driven advertising spend and Walmart's supply chain AI. For smaller companies, it could be something like employing an AI algorithm to decide which sales prospects are most likely to buy.
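To make the sales-prospect example concrete, here is a minimal sketch of the kind of model a no-code tool might assemble behind its drag-and-drop interface: a tiny logistic scoring function that ranks prospects by purchase likelihood. The feature names, weights, and sample data are all hypothetical stand-ins for what such a tool would learn from a company's real records.

```python
import math

# Illustrative weights a no-code AI tool might learn from historical sales data.
# Feature names and values here are invented for the sketch.
WEIGHTS = {"site_visits": 0.8, "demo_requested": 1.5, "company_size": 0.3}
BIAS = -2.0

def score(prospect):
    """Return a 0-1 purchase-likelihood score for one prospect."""
    z = BIAS + sum(WEIGHTS[k] * prospect[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) squashes z to (0, 1)

prospects = {
    "Acme Co":  {"site_visits": 5, "demo_requested": 1, "company_size": 2},
    "Beta LLC": {"site_visits": 1, "demo_requested": 0, "company_size": 1},
}

# Rank prospects so the sales team calls the likeliest buyers first.
ranked = sorted(prospects, key=lambda name: score(prospects[name]), reverse=True)
```

The point of the no-code movement is that a non-specialist never sees code like this: the tool chooses the features, fits the weights, and exposes only the ranked list.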

Redefining Product Development to Include No-Code AI 

For industrial engineering, industrial design, and product development firms, there is a real opportunity here to expand their engineering departments to include "no-code AI developers" for internal and client-facing solutions. This doesn't mean hiring specialized developers and data scientists, which would be both cost-prohibitive and overkill at this stage of the game. It means up-skilling current engineering staff, who are already well positioned to understand and navigate the quickly evolving drag-and-drop AI programming interfaces. Just like web development, AI programming technology will one day be user-friendly enough for even the novice. Product development companies already have the intellectual bandwidth within their walls to take on no-code AI projects, either as a value-add or as an additional (and very lucrative) new service offering.

The Ethical Implications of No-Code AI

These examples are fairly innocuous when you think about them in terms of an everyday company IT person using a drag-and-drop AI development tool to get a shipment from point A to point B. But what about when the algorithm being developed--without any real ethical input, training, or oversight--is assigning credit limits? Setting bail? Running security surveillance, public or private? In cases like these, and scores more to come, there are major concerns about the unavoidable bias of the developer. And there should be, considering what we've seen from AI already. In short, with the advent of the "citizen developer," ethical issues start piling up at a frightening pace.

Artificial Intelligence Ethics

The Past Is No Great Predictor of the Future, But It's All We've Got

To elucidate the possible future of AI, especially regarding ethics, we are forced to look at the past and present, because those are the only clues we've got. While AI has what appears to be near-infinite potential to enhance the human experience, without ethics, legislation, and regulation, it could diminish or, at the very worst, destroy us. Social media is one of our favorite examples of so-called enlightening technology gone awry. Rife with widespread and influential misinformation, it has altered the fabric of American society. Specific to AI, fairness, inclusion, diversity, and privacy for future generations are all on the line. Numerous studies have shown AI systems to be biased and discriminatory with respect to race, income, gender, and sexuality, and this appears to be widely understood.

There is already widespread development of AI ethical codes of conduct by governments, large tech companies, non-profits, think tanks, and universities. A review of eighty-four AI ethics guidelines, published in Nature Machine Intelligence in September 2019, artfully outlined several "best practices" such as transparency, explainability, interpretability, and disclosure around AI algorithms. An excellent start, but one that barely scratches the surface. As the trickle-down coding of AI fast becomes a torrent, will the widespread implementation of AI outpace ethical development? Only time will tell. But, for humanity's sake, let's hope not.
