January 03, 2022

Get the most value from your data with data lakehouse architecture

A data lakehouse is essentially the next breed of cloud data lake and warehousing architecture, combining the best of both worlds. It is an architectural approach for managing all data formats (structured, semi-structured, or unstructured) and supporting multiple data workloads (data warehousing, BI, AI/ML, and streaming). Data lakehouses are underpinned by a new, open system architecture that lets data teams apply warehouse-style data structures and smart data management features on top of the kind of low-cost storage used in data lakes. ... A data lakehouse architecture allows data teams to glean insights faster because they can harness data without hopping between multiple systems. It can also help companies ensure that data teams have the most accurate and up-to-date data at their disposal for mission-critical machine learning, enterprise analytics initiatives, and reporting. There are several reasons to look at modern data lakehouse architecture as a way to drive sustainable data management practices.
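As a rough illustration of the pattern rather than anything specific to the article, the sketch below assumes PySpark with the open-source Delta Lake format: structured and semi-structured data land in the same low-cost object store, and the same tables then serve both SQL reporting and ML feature work. The bucket, paths, and column names are hypothetical placeholders.

```python
# Minimal lakehouse sketch: one object store, one open table format,
# shared by BI-style SQL and ML pipelines. Assumes PySpark with the
# delta-spark package installed; all paths/columns are illustrative.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Land semi-structured events next to structured records in the same store.
events = spark.read.json("s3://example-bucket/raw/events/")     # semi-structured
orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # structured

# Write both as Delta tables: cheap object storage underneath,
# warehouse-style transactions and schema enforcement on top.
events.write.format("delta").mode("overwrite").save("s3://example-bucket/lakehouse/events")
orders.write.format("delta").mode("overwrite").save("s3://example-bucket/lakehouse/orders")

# The same table now answers a reporting query without copying data
# into a separate warehouse system.
spark.read.format("delta").load("s3://example-bucket/lakehouse/orders") \
    .createOrReplaceTempView("orders")
spark.sql(
    "SELECT order_date, SUM(amount) AS revenue FROM orders GROUP BY order_date"
).show()
```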


A CISO’s guide to discussing cybersecurity with the board

When you get a chance to speak with executives, you typically don’t have much time to discuss details. And frankly, that’s not what executives are looking for, anyway. It’s important to phrase cybersecurity conversations in a way that resonates with those leaders. Messaging starts with understanding the C-suite’s and board’s priorities. Usually, they are interested in big-picture initiatives, so explain why cyber investment is critical to the success of these initiatives. For example, if the CEO wants to increase total revenue by 5% in the next year, explain how an investment in cybersecurity helps protect that goal by preventing major, avoidable losses from a cyberattack. Once you know the executive team’s and board’s goals, look to specific members and identify a potential ally. Has one team recently had a workplace security breach? Does one leader have a difficult time getting his or her team to understand the makings of a phishing scheme? These interests and experiences can help guide the explanation of the security solution. If you’re a CISO, you’re well-versed in cybersecurity, but remember that not everyone is as involved in the subject as you are, and business leaders probably will not understand technical jargon.


Best of 2021 – Containers vs. Bare Metal, VMs and Serverless for DevOps

A bare metal machine is a dedicated server using dedicated hardware. Data centers have many bare metal servers that are racked and stacked in clusters, all interconnected through switches and routers. Human and automated users of a data center access the machines through access servers, high-security firewalls and load balancers. The virtual machine introduced an operating system simulation layer between the bare metal server’s operating system and the application, so one bare metal server can support more than one application stack with a variety of operating systems. This provides a layer of abstraction that allows the servers in a data center to be software-configured and repurposed on demand. In this way, a virtual machine can be scaled horizontally, by configuring multiple parallel machines, or vertically, by allocating more power to a single virtual machine. One of the problems with virtual machines is that the operating system simulation layer is quite “thick,” so loading and configuring each VM takes considerable time. In a DevOps environment, where changes occur frequently, that overhead adds up.
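To make the horizontal-versus-vertical distinction concrete, here is a small sketch using the AWS EC2 API via boto3. AWS is not named in the article; it is used here only as an example of software-configured virtual machines, and every AMI and instance ID below is a placeholder.

```python
# Illustrative only: horizontal vs. vertical scaling of VMs with boto3.
# IDs and instance types are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Horizontal scaling: configure multiple parallel machines.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="t3.medium",
    MinCount=3,
    MaxCount=3,
)

# Vertical scaling: allocate more power to a single existing machine.
# The instance must be stopped before its type can be changed.
instance_id = "i-0123456789abcdef0"    # placeholder instance
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "t3.2xlarge"},
)
ec2.start_instances(InstanceIds=[instance_id])
```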


Desktop High-Performance Computing

Many engineering teams rely on desktop products that only run on Microsoft Windows. Desktop engineering tools that perform tasks such as optical ray tracing, genome sequencing, or computational fluid dynamics often couple graphical user interfaces with complex algorithms that can take many hours to run on traditional workstations, even when powerful CPUs and large amounts of RAM are available. Until recently, there has been no convenient way to scale complex desktop computational engineering workloads seamlessly to the cloud. Fortunately, the advent of the AWS Cloud Development Kit (CDK), Amazon Elastic Container Service (ECS), and Docker finally makes it easy to scale desktop engineering workloads written in C# and other languages to the cloud. ... The desktop component first builds and packages a Docker image that can perform the engineering workload (factor an integer). AWS CDK, executing on the desktop, deploys the Docker image to AWS and stands up cloud infrastructure consisting of input/output worker queues and a serverless ECS Fargate cluster.
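The article's example is written in C#; as a rough sketch of the same pattern, the AWS CDK v2 stack below (in Python) stands up input/output queues and a Fargate service that runs a containerized worker built from a local Docker directory. The stack name, image directory, task sizing, and environment variable names are all hypothetical.

```python
# Hypothetical CDK v2 sketch: SQS input/output queues + serverless Fargate workers.
from aws_cdk import App, Stack
from aws_cdk import aws_ec2 as ec2, aws_ecs as ecs, aws_sqs as sqs
from constructs import Construct


class WorkerStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Queues shared by the desktop client and the cloud workers.
        input_queue = sqs.Queue(self, "InputQueue")
        output_queue = sqs.Queue(self, "OutputQueue")

        # Serverless Fargate cluster to run the containerized workload.
        vpc = ec2.Vpc(self, "Vpc", max_azs=2)
        cluster = ecs.Cluster(self, "Cluster", vpc=vpc)

        task_def = ecs.FargateTaskDefinition(
            self, "WorkerTask", cpu=1024, memory_limit_mib=2048
        )
        task_def.add_container(
            "Worker",
            # Builds the local Docker image (e.g. the integer-factoring worker)
            # and publishes it as part of `cdk deploy`.
            image=ecs.ContainerImage.from_asset("./worker"),
            logging=ecs.LogDrivers.aws_logs(stream_prefix="worker"),
            environment={
                "INPUT_QUEUE_URL": input_queue.queue_url,
                "OUTPUT_QUEUE_URL": output_queue.queue_url,
            },
        )

        # Workers poll the input queue and write results to the output queue.
        ecs.FargateService(
            self, "WorkerService",
            cluster=cluster, task_definition=task_def, desired_count=4,
        )
        input_queue.grant_consume_messages(task_def.task_role)
        output_queue.grant_send_messages(task_def.task_role)


app = App()
WorkerStack(app, "DesktopHpcWorkers")
app.synth()
```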


Micromanagement is not the answer

Neuroscience also reveals why micromanaging is counterproductive. Donna Volpitta, an expert in “brain-based mental health literacy,” explained to me that the two most fundamental needs of the human brain are security and autonomy, both of which are built on trust. Leaders who instill a sense of trust in their employees foster that sense of security and autonomy and, in turn, loyalty. When leaders micromanage their employees, they undermine that sense of trust, which tends to breed evasion behaviors in employees. It’s a natural brain response. “Our brains have two basic operating modes—short-term and long-term,” Volpitta says. “Short-term is about survival. It’s the freeze-flight-fight response or, as I call it, the ‘grasshopper’ brain that is jumping all over. Long-term thinking considers consequences [and] relationships, and is necessary for complex problem solving. It’s the ‘ant’ brain, slower and steadier.” She says micromanagement constantly triggers short-term, survival thinking detrimental to both social interactions and task completion.


Unblocking the bottlenecks: 2022 predictions for AI and computer vision

One of the key challenges of deep learning is the need for huge amounts of annotated data to train large neural networks. While this is the conventional way to train computer vision models, the latest generation of technology providers is taking an innovative approach that enables machine learning with comparatively less training data. This includes moving away from supervised learning to self-supervised and weakly supervised learning, where data availability is less of an issue. This approach, also known as few-shot learning, detects objects as well as new concepts with considerably less input data. In many cases, the algorithm can be trained with as few as 20 images. ... Privacy remains a major concern in the AI sector. In most cases, a business must share its data assets with the AI provider via third-party servers or platforms when training computer vision models. Under such arrangements there is always the risk that the third party could be hacked or even exploit valuable metadata for its own projects. As a result, we’re seeing the rise of Privacy-Enhancing Computation, which enables data to be shared between different ecosystems in order to create value while maintaining data confidentiality.
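For a sense of how a vision model can learn from only a handful of labeled images, here is a minimal transfer-learning sketch in PyTorch: a pretrained backbone is frozen and only a small classification head is trained. This illustrates the "less training data" idea in general terms; it is not any particular vendor's few-shot or self-supervised method, and the dataset path and class count are hypothetical.

```python
# Minimal sketch: fine-tune a pretrained ResNet-18 head on ~20 labeled images.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets
from torchvision.models import resnet18, ResNet18_Weights

weights = ResNet18_Weights.DEFAULT
model = resnet18(weights=weights)

# Freeze the pretrained backbone; only the small new head is learned,
# which is what makes a ~20-image training set workable.
for param in model.parameters():
    param.requires_grad = False
num_classes = 2                      # hypothetical: e.g. "defect" vs. "ok"
model.fc = nn.Linear(model.fc.in_features, num_classes)

# ~20 images arranged as tiny_dataset/<class_name>/<image>.jpg
dataset = datasets.ImageFolder("tiny_dataset", transform=weights.transforms())
loader = DataLoader(dataset, batch_size=4, shuffle=True)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```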

Read more here ...
