The Year of Resilience in a Data-Powered World
These three trends will help mold how we interact with and leverage data in 2022.
The onset of the pandemic served as an eye-opening experience for those in the data and analytics industry. With IDC projecting that the total amount of data consumed will reach as much as 181 zettabytes by 2025, and with the acceleration in digital transformation we have seen over the past two years, multiple concerns must be addressed as we aim to thrive in a digitally transformed world.
Trend #1: Democratization of chaos engineering will begin
In distributed computing, workload cluster failures have been top of mind for organizations. Until now, however, this has been a concern only for large enterprises with a massive web presence serving real-time transactions. As the pandemic accelerated digital transformation, it pushed more consumers into an online setting. Now that organizations of all sizes rely on distributed computing deployments, we'll see a shift toward chaos engineering -- a Netflix-pioneered concept that better identifies vulnerabilities and builds resilience in these highly agile environments.
Chaos engineering is a process for testing a highly distributed computing platform's ability to withstand random disruptions, with the ultimate goal of improving its reliability and resilience. It has been adopted by other major web companies, but it hasn't caught on with organizations running sub-hyperscale deployments, which lack the resources to leverage it -- that is, until now. Unlike centralized systems, distributed systems are more complex, and traditional software quality control approaches -- such as standard unit testing and integration testing -- struggle to ensure the stability of the system. We'll need to shift from deterministic testing toward deliberately raising the probability of abnormal states, so that problems are exposed as quickly as possible.
In 2022, chaos engineering will be democratized through a new concept that is beginning to gain steam: chaos-as-a-service (CaaS). This concept will provide a quick, simple method to run chaos experiments and test systems' resiliency. CaaS will give access to a valuable but complex technology, eventually enabling organizations that aren't running at the scale of Netflix or Facebook to leverage chaos engineering and boost their infrastructure's durability.
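The core idea behind a chaos experiment can be sketched in a few lines: define a steady-state hypothesis (e.g., "success rate stays above 99%"), inject faults into a dependency at a known rate, and verify that the system's resilience mechanisms keep the hypothesis true. The sketch below is a minimal, self-contained illustration under assumed names (`flaky_call`, `resilient_call`, `run_experiment` are all hypothetical), not any particular CaaS product's API; it uses simple retries as the resilience mechanism under test.

```python
import random

def flaky_call(failure_rate, rng):
    """A dependency with injected chaos: fails randomly at failure_rate."""
    if rng.random() < failure_rate:
        raise ConnectionError("injected fault")
    return "ok"

def resilient_call(failure_rate, rng, retries=3):
    """Client-side resilience under test: retry on transient failure."""
    for _ in range(retries + 1):
        try:
            return flaky_call(failure_rate, rng)
        except ConnectionError:
            continue
    raise ConnectionError("all retries exhausted")

def run_experiment(failure_rate, requests=1000, seed=42):
    """Run the chaos experiment and measure the observed success rate.

    Steady-state hypothesis: with retries in place, the success rate
    stays near 100% even when 20% of dependency calls are made to fail.
    """
    rng = random.Random(seed)  # seeded so the experiment is reproducible
    successes = 0
    for _ in range(requests):
        try:
            resilient_call(failure_rate, rng)
            successes += 1
        except ConnectionError:
            pass
    return successes / requests

if __name__ == "__main__":
    print(f"success rate under chaos: {run_experiment(failure_rate=0.2):.3f}")
```

With a 20% injected failure rate and up to four attempts per request, the per-request failure probability drops to roughly 0.2^4 = 0.16%, which is the kind of quantitative resilience claim a chaos experiment is designed to validate or falsify.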
Trend #2: Data begins to function as a utility
Data powers the modern world. Organizations of all types depend on massive, rapidly growing and evolving data sets to deliver more intelligent services and achieve business growth. Given its importance and pervasiveness, data should be treated as a utility -- just like water, gas, and electricity. This means it must be made readily available and refined for easy and structured access.
Data can only become a utility with support from open source databases, data integration, and modern data management tools. Open source technology in particular will play a vital role, helping democratize these technologies and lower the barrier to entry while improving quality and reliability. Much like energy, data's power is largely invisible, so leaders will have to start by understanding their business needs, then leverage the benefits of open source and choose technologies that fulfill those needs, whether scalability, availability, security, or a combination of these.
Trend #3: The rise of cloud-native databases