Apache Hadoop Data Warehouse Architecture for EDW Optimization

Reduce Costs by Moving Data and Processing to Hadoop®


What is an EDW?

An Enterprise Data Warehouse (EDW) is an organization's central data repository, built to support business decisions. An EDW contains data related to the areas the company wants to analyze; for a manufacturer, that might be customer, product or bill-of-material data. An EDW is built by extracting data from a number of operational systems. As the data is fed into the EDW, it is converted, reformatted and summarized to present a single corporate view. Data is added to the warehouse over time in the form of snapshots, and an enterprise data warehouse normally contains data spanning 5 to 10 years. A Hadoop data warehouse architecture enables deeper analytics and advanced reporting from these diverse sets of data.
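
The extract, transform and snapshot pattern described above can be sketched in a few lines of PySpark. This is a minimal illustration only, assuming a hypothetical orders feed; the paths, table names and columns are not part of HDP or any particular EDW product.

```python
# Minimal sketch of a snapshot-style warehouse load with PySpark.
# All paths, table names and columns below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("edw-snapshot-load").enableHiveSupport().getOrCreate()

# Extract: read one day's export from an operational system (hypothetical path).
orders = spark.read.csv("/landing/orders/2017-06-01", header=True, inferSchema=True)

# Transform: convert, reformat and summarize to the single corporate view.
daily_summary = (orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "product_id")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("order_count")))

# Load: append today's snapshot so history accumulates over the years.
daily_summary.write.mode("append").partitionBy("order_date").saveAsTable("edw.daily_order_summary")
```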

EDW Optimization

Problems with a typical EDW

The Enterprise Data Warehouse has become a standard component of corporate data architectures. However, the complexity and volume of today's data pose real challenges to the efficiency of existing EDW solutions.

Realizing the transformative potential of Big Data depends on a corporation's ability to manage complexity while leveraging data sources of all types, such as social, web, IoT and more. Integrating new data sources into the existing EDW system gives corporations more, and deeper, analytics and insights. More importantly, EDW optimization using Hadoop provides a highly cost-efficient environment with optimal performance, scalability and flexibility.

Elements of the Solution

Hortonworks Data Platform

* Powerful open Hadoop data warehouse architecture with capabilities for data governance and integration, data management, data access, security and operations, designed for deep integration with your existing data center technology. Learn More

Syncsort

* EDW offload to Hadoop - high-performance ETL software to access and easily onboard traditional enterprise data to HDP. Learn More

PROFESSIONAL SERVICES

* Guidance and assistance from experienced staff to quickly verify the value of the new architecture and get the most out of a validated, fully tested Hortonworks data architecture optimization solution. Learn More

EDW Optimization with Apache Hadoop®

Flexible

* Data can be loaded into HDP without a data model in place

* A data model can be applied based on the questions being asked of the data (schema-on-read); see the sketch after this list

* HDP is designed to answer questions as they occur to the user
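
A minimal sketch of schema-on-read with PySpark, assuming a hypothetical clickstream landing directory; the path and field names are made up for illustration.

```python
# Schema-on-read sketch: land raw data first, apply a model only when querying.
# The path and field names are assumptions, not part of HDP itself.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("schema-on-read").getOrCreate()

# Land the raw JSON as-is: no data model is required up front.
raw = spark.read.text("/landing/clickstream/")
print(raw.count())

# Later, apply a schema only when a question is asked of the data.
clicks_schema = StructType([
    StructField("user_id", StringType()),
    StructField("page", StringType()),
    StructField("duration", DoubleType()),
])
clicks = spark.read.schema(clicks_schema).json("/landing/clickstream/")
clicks.groupBy("page").count().show()
```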

Efficient

* 100% of the data is available at a granular level for analysis

* HDP can store and analyze both structured and unstructured data (see the sketch after this list)

* Data can be analyzed in different ways to support diverse use cases
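
As a small illustration of the structured-plus-unstructured point, the sketch below reads granular transaction records alongside free-text support tickets in Spark; the file locations and the keyword filter are assumptions made for the example.

```python
# Analyzing structured and unstructured data side by side in Spark.
# Paths and the "refund" keyword are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mixed-data").getOrCreate()

# Structured: granular transaction records stored as Parquet.
transactions = spark.read.parquet("/data/transactions/")

# Unstructured: free-text support tickets stored as plain text files.
tickets = spark.read.text("/data/support_tickets/")
refund_mentions = tickets.filter(F.lower(F.col("value")).contains("refund"))

print(transactions.count(), refund_mentions.count())
```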

Cost Effective

* HDP (Hortonworks Data Platform) is 100% open - there is no licensing fee for software

* HDP runs on commodity hardware

* New data can be landed in HDP and used in days or even hours

Use Cases for EDW Optimization

USE CASE 1

Fast BI on Hadoop

Proprietary EDW systems were adopted in the past to take advantage of fast BI and deep, detailed analytics; however, EDW systems are extremely expensive, and they have not adapted to today's big data challenges, such as unstructured data and large-scale analytics.

Hortonworks makes fast BI on Hadoop a reality by combining a fast in-memory SQL engine for creating data marts with an OLAP cubing engine that lets you query huge datasets in seconds. This gives you the choice of querying pre-aggregated data for maximum performance, or data in full-fidelity form when the finest grains of detail are needed, with access from any major BI tool that supports ODBC, JDBC or MDX.
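
As a rough illustration, the snippet below queries HDP's SQL layer from Python with the PyHive client against the same HiveServer2 endpoint that BI tools reach over JDBC/ODBC; the host, tables and columns are assumptions, standing in for whatever data marts you actually build.

```python
# Hedged sketch: querying pre-aggregated vs. full-fidelity data on HDP.
# Host, port, table and column names are assumptions for illustration.
from pyhive import hive

conn = hive.connect(host="hiveserver2.example.com", port=10000, username="analyst")
cur = conn.cursor()

# Pre-aggregated data mart for maximum performance...
cur.execute("SELECT region, SUM(revenue) FROM sales_cube GROUP BY region")
print(cur.fetchall())

# ...or the full-fidelity detail table when the finest grains are needed.
cur.execute("SELECT * FROM sales_detail WHERE order_id = '12345' LIMIT 10")
print(cur.fetchall())
```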

Learn More

USE CASE 2

INTEGRATED ETL PROCESSES FOR HADOOP

A typical EDW spends between 45 and 65 percent of its CPU cycles on ETL processing. These lower-value ETL jobs compete for resources with more business-critical workloads and can cause SLA misses. Hadoop can offload these ETL jobs from the EDW with minimal porting effort and at substantially lower cost, saving money and freeing up capacity on your EDW for higher-value analytical workloads. Hortonworks makes it easy by providing high-performance ETL tools, a powerful SQL engine and integration with all major BI vendors.
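
A hedged sketch of what such an offloaded ETL job can look like in PySpark; the source feed, column names and target Hive table are assumptions made for illustration, not a prescribed pipeline.

```python
# ETL offload sketch: run the transform on the Hadoop cluster, not in the EDW.
# Paths, columns and the target table are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-offload").enableHiveSupport().getOrCreate()

# Extract the raw feed that previously consumed EDW CPU cycles.
raw = spark.read.option("header", True).csv("/landing/pos_feed/")

# Transform on commodity Hadoop nodes.
cleaned = (raw
    .dropDuplicates(["transaction_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0))

# Load only the conformed result, freeing EDW capacity for analytical workloads.
cleaned.write.mode("overwrite").format("orc").saveAsTable("edw_staging.pos_transactions")
```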

Learn More

USE CASE 3

DATA ARCHIVING IN HADOOP

Continually growing data volumes and costs force many companies to archive older data on tape, making analysis impossible and any later retrieval of the data expensive.

A Hadoop data warehouse architecture offers a cost per terabyte on par with tape backup solutions. Because of that low cost, you can store years of data rather than months. All of your enterprise data remains available for retrieval, query and deep analytics with the same tools you use on existing EDW systems.
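
A minimal sketch of this archiving pattern with PySpark, assuming a hypothetical export of cold order data; the paths, table names and partitioning column are illustrative only.

```python
# Archiving aged warehouse data into Hadoop while keeping it queryable.
# Export path, table names and the partition column are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("edw-archive").enableHiveSupport().getOrCreate()

# Read a yearly export of cold EDW records (hypothetical path).
cold = spark.read.parquet("/export/edw/orders_2009/")

# Store it partitioned by year in an ORC-backed Hive table instead of on tape.
(cold.withColumn("year", F.year("order_date"))
     .write.mode("append")
     .partitionBy("year")
     .format("orc")
     .saveAsTable("archive.orders"))

# The archive stays available to the same SQL tools used against the EDW.
spark.sql("SELECT year, COUNT(*) FROM archive.orders GROUP BY year").show()
```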

Learn More