The Top Technology Trends and Their Impact on Data Science, Machine Learning and AI | by Isabelle Flückiger | Jan, 2021


The last block is called resilient delivery. Resilience means “the ability of a substance to return to its usual shape after being bent, stretched, or pressed.” While companies focused in recent years on optimized, efficient operations, COVID-19 and the current recession hit their fragile processes hard. So, technology-driven resilience is the new focus for recovering fast.

7. Intelligent composable business

What it is: While rebuilding the business and its processes, a design is needed that enables better access to information, augments it with new insights, is composable and modular, and can change and respond more quickly to decisions and disruption: a so-called intelligent composable business. The focus is on the autonomy of decision making, the democratization of applications, and business capabilities. The plasticity of a company is key.

My opinion on the impact: That description of the trend is a bit abstract. My interpretation is the following: During phases of change, people and organizations must be enabled to make real-time, relevant, and contextual business decisions. That cannot be done anymore with centralized decision-makers. With relevant data and insights, decisions must be made in a decentralized way, and the capabilities to implement them must adapt almost simultaneously. So, people in an organization need to be empowered for that. The impact will be that everybody in an organization should be a citizen data scientist, “a person who creates or generates models that use advanced diagnostic analytics or predictive and prescriptive capabilities, but whose primary job function is outside the field of statistics and analytics.” On the one hand, people from outside the classical data science tracks will enter and perform these tasks, supported by a lot of automation.

On the other hand, data scientists need a clearly differentiating profile to be recognized as experts who develop and implement advanced applications. The end-to-end data science process will become more fragmented across automation, citizen data scientists, and specialized data scientists. Business decision-making and communication skills of the data scientist become more critical than ever.

My advice for action: Data science automation will evolve. So, as a data scientist, make sure you stay relevant through education in advanced topics. Start with advanced training now, and aim especially for cloud-related certificates or specializations. Citizen data scientists will perform tasks of medium complexity. Also, get trained in business decision making and communication.

Second, for people not yet in data science, this trend opens up many entry opportunities. You should start with a foundational data science education and sound business analytics skills. You do not need to be a coding expert, but you should be able to work with tools like R or Tableau.

8. AI engineering

What it is: According to Gartner, “the performance, scalability, interpretability, and reliability of AI models need robust AI engineering. Without AI engineering, most organizations will fail to move AI projects beyond proofs of concept and prototypes to full-scale production.” The three pillars of AI engineering are DataOps, ModelOps, and DevOps. DevOps deals mainly with high-speed code changes, but AI projects experience dynamic changes in code, models, and data, and all three must be improved together. Organizations must apply DevOps principles across the data pipeline and the machine learning model pipeline.
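The idea of applying DevOps-style gates to data and models can be sketched in a few lines. This is a minimal illustration, not a real MLOps stack: the function names, thresholds, and the deliberately trivial “model” are all assumptions made for the example.

```python
# A minimal sketch of DataOps/ModelOps "gates": every change to data or
# models must pass automated checks before it is promoted, mirroring how
# DevOps gates code changes. All names and thresholds are illustrative.

def validate_data(rows):
    """DataOps gate: reject batches with schema or range problems."""
    for row in rows:
        if set(row) != {"feature", "label"}:
            return False
        if not 0.0 <= row["feature"] <= 1.0:
            return False
    return True

def train_model(rows):
    """A deliberately trivial 'model': always predict the mean label."""
    mean_label = sum(row["label"] for row in rows) / len(rows)
    return lambda feature: mean_label

def evaluate_model(model, rows, max_mae=0.5):
    """ModelOps gate: block promotion if the error exceeds a threshold."""
    mae = sum(abs(model(r["feature"]) - r["label"]) for r in rows) / len(rows)
    return mae <= max_mae

batch = [{"feature": 0.2, "label": 1.0}, {"feature": 0.8, "label": 0.0}]
if validate_data(batch):
    model = train_model(batch)
    promote = evaluate_model(model, batch)  # both gates passed
```

In a real setup, the same gates would run in a CI/CD system on every change to the data, the features, or the model, which is exactly the “DevOps principles across the pipeline” point above.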

My opinion on the impact: Currently, 80–85% of AI projects still do not deliver the intended outcome. So, this is a trend where neither companies nor you have any other choice. It is a must. Successful tech companies already work with this mindset; all others need it, too, to stay relevant.

My advice for action: My advice is concise: learn it. Apply it. And use all the productivity tools associated with it.

One last word: DevOps, DataOps, ModelOps, and MLOps are not tools, technologies, frameworks, or methodologies. They are a way of working and, most importantly, of learning: a mindset, a culture, a philosophy. Always bear that in mind.

9. Hyperautomation

What it is: Gartner says that “hyperautomation is a process in which businesses automate as many business and IT processes as possible using tools like AI, machine learning, event-driven software, robotic process automation, and other types of decision process and task automation tools.” The end-to-end digitalization ensures not only seamless remote work but also digital operational excellence and operational resilience.

My opinion on the impact: More and more companies move to data-driven business models that require fast reactions to the market and to customers. Companies are already working on it, for several reasons: speed to market, competitive advantage, the scarcity of resources like data scientists, and the dependency on them. Hyperautomation shifts the tasks of data scientists. They move from low-level business analytics and data analyst work to automation and outcome-oriented tasks. Their duties will include end-to-end (quality) controlling and oversight, working with automation tools, full integration into business processes, and providing the corresponding AI and machine learning support.
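The shift toward oversight can be made concrete with automated quality checks that watch a live pipeline, so the data scientist specifies the checks rather than reviewing every batch by hand. A minimal sketch, with illustrative check names and thresholds that are my assumptions, not a standard:

```python
# A hedged sketch of automated oversight: the pipeline reports alerts,
# and a human only steps in when a check fires. Thresholds are examples.

def quality_report(reference, live, drift_tol=0.25, missing_tol=0.1):
    """Return a list of alerts; an empty list means the batch is healthy."""
    alerts = []
    live_values = [x for x in live if x is not None]
    missing_rate = 1 - len(live_values) / len(live)
    if missing_rate > missing_tol:
        alerts.append(f"missing rate {missing_rate:.0%} exceeds {missing_tol:.0%}")
    if live_values:
        ref_mean = sum(reference) / len(reference)
        live_mean = sum(live_values) / len(live_values)
        if abs(live_mean - ref_mean) > drift_tol:
            alerts.append(f"mean drifted from {ref_mean:.2f} to {live_mean:.2f}")
    return alerts

healthy = quality_report([0.4, 0.5, 0.6], [0.45, 0.55, 0.5])   # no alerts
drifted = quality_report([0.4, 0.5, 0.6], [0.9, None, 1.0], missing_tol=0.5)
# one drift alert; the missing rate stays within the relaxed tolerance
```

This is the “air traffic controller” posture in miniature: define the control tower once, then supervise exceptions instead of piloting every analysis.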

My advice for action: This trend requires you to develop your skills in two directions. First, get familiar with end-to-end platforms (KDnuggets has a summary of Gartner’s Magic Quadrant), working frameworks (see no. 8 above), and programming languages like C/C++, Java, Go, and Rust; Python is not a language for hyperautomation. Second, understand the business side and what drives customer experience, and learn how to do “oversight” instead of only “implementation.” You will be the “air traffic controller,” not the pilot.
