#185 History of Data-centric Applications (revisited)

The first episode of this podcast was released 185 episodes ago. In this episode, host Darren Pulsipher revisits episode one to provide updated information on the history of data-centric application development. He discusses how new technologies like edge computing and AI have impacted data generation and the need for better data management.

Early Data Processing

In the early days of computing, applications were built to transform data from one form into another valuable output. Early computers like the ENIAC and the machine Turing designed to break the Enigma code worked by taking in data, processing it through an application, and outputting it to storage. Over time, technology advanced from specialized hardware to more generalized systems with CPUs and networking capabilities. This allowed data sharing between systems, enabling new applications.

Emergence of Virtualization

In the 1990s and 2000s, virtualization technology allowed entire systems to be encapsulated into virtual machines. This decoupled the application from the hardware, increasing portability. With the rise of Linux, virtual machines could run on commodity x86 processors, lowering costs and barriers to entry. Virtualization increased ease of use but introduced new security and performance concerns.

The Rise of Cloud Computing

Cloud computing builds on virtualization, providing easy, on-demand access to computing resources over the internet. This allowed organizations to reduce capital expenditures and operational costs. However, moving to the cloud brought security, performance, and integration challenges. The cloud's pay-as-you-go model enabled new use cases and made consuming technology resources easier overall.

Containerization and New Complexity

Containerization further abstracted applications from infrastructure by packaging apps with their runtimes, configuration, and dependencies. This increased portability but also added complexity in managing distributed applications and data across environments. Locality of data became a key concern, contradicting the assumption that data is available everywhere. This evolution brought significant new security implications.

Refocusing on Data

To address these challenges, new architectures like data meshes and distributed information management focus on data locality, governance, lifecycle management, and orchestration. Data must be contextualized across applications, infrastructure, and users to deliver business value securely. Technologies like AI are driving exponential data growth across edge environments, so more robust data management capabilities are critical to overcoming complexity and risk.

Security Concerns with Data Distribution

The distribution of data and applications across edge environments has massively increased the attack surface. Zero trust principles are being applied to improve security, with a focus on identity and access controls as well as detection, encryption, and hardware roots of trust.

The Edgemere Architecture

The Edgemere architecture provides a model for implementing security across modern, complex technology stacks spanning hardware, virtualization, cloud, data, and applications. Applying zero trust principles holistically across these layers is critical for managing risk. Robust cybersecurity capabilities like encryption and access controls are essential for delivering business value from data in this new era of highly distributed and interconnected systems.

About the Podcast

Darren Pulsipher, Chief Solution Architect for Public Sector at Intel, investigates effective change leveraging people, process, and technology. Which digital trends are a flash in the pan, and which will form the foundations of lasting change? With in-depth discussion and expert interviews, Embracing Digital Transformation finds the signal in the noise of the digital revolution.

People

Workers are at the heart of many of today’s biggest digital transformation projects. Learn how to transform public sector work in an era of rapid disruption, including overcoming the security and scalability challenges of the remote work explosion.

Processes

Building an innovative IT organization in the public sector starts with developing the right processes to evolve your information management capabilities. Find out how to boost your organization to the next level of data-driven innovation.

Technologies

From the data center to the cloud, transforming public sector IT infrastructure depends on having the right technology solutions in place. Sift through confusing messages and conflicting technologies to find the true lasting drivers of value for IT organizations.