Conceptual Data Flow

Typical Big Data environments implement a layered architecture. The layers decouple the stages that data passes through in the processing pipeline, which helps protect the critical infrastructure. The data flows through an ingest, store, process, and actionize cycle, depicted in the following figure along with popular frameworks used to implement the workflow:

Figure 11.3: Conceptual data flow along with popular frameworks for implementing cyber security
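To make the decoupling concrete, the following is a minimal Python sketch of the four-stage cycle. The stage functions and names here are hypothetical stand-ins: in a real deployment each stage would be backed by a framework such as Apache Kafka for ingestion, HDFS for storage, or Apache Spark for processing, but the point is that each layer can be swapped out without touching the others.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Pipeline:
    """Chains the ingest -> store -> process -> actionize stages."""
    stages: List[Callable] = field(default_factory=list)

    def add_stage(self, stage: Callable) -> "Pipeline":
        self.stages.append(stage)
        return self

    def run(self, events):
        data = events
        for stage in self.stages:
            data = stage(data)
        return data

# In-memory stand-in for a distributed store such as HDFS.
archive: List[str] = []

# Hypothetical stage implementations; each is decoupled and could be
# replaced by a framework-backed equivalent without changing the others.
def ingest(raw_events):
    # e.g. consume and normalize events from a message bus such as Kafka
    return [e.strip().lower() for e in raw_events]

def store(events):
    # e.g. persist to distributed storage; here, append to a list
    archive.extend(events)
    return events

def process(events):
    # e.g. a Spark job flagging suspicious activity; here, a simple filter
    return [e for e in events if "failed login" in e]

def actionize(alerts):
    # e.g. push alerts to a security dashboard; here, print them
    for alert in alerts:
        print(f"ALERT: {alert}")
    return alerts

pipeline = (Pipeline()
            .add_stage(ingest)
            .add_stage(store)
            .add_stage(process)
            .add_stage(actionize))

pipeline.run(["Failed login from 10.0.0.5", "User bob logged out"])
```

Running this prints an alert for the failed-login event while the benign event passes through silently, illustrating how data moves through the decoupled layers end to end.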

Most of the components shown in this figure are open source, the result of collaborative effort by a large community. A detailed ...
