Webinar: Altis Data Load Scaffolding Using Snowflake
by Anup Ramadas – Principal Consultant, Gordon Sinclair – Managing Consultant & Manan Patel – Senior Consultant
Built on over 20 years of experience, the Altis Data Load Scaffolding (DLS) helps Altis’ clients simplify their Data Warehouse development, operation and maintenance.
In this webinar, we discussed how the DLS is used to help our clients efficiently build a robust and reliable data warehouse. The case studies presented highlighted the practical applications of key DLS features.
What is DLS?
The DLS is an Extract, Load and Transform (ELT) accelerator built to expedite Snowflake data loads. DLS speeds the ingestion and transformation of data using pre-built code and guided patterns.
As shown in the illustration, the toolset is used to orchestrate and schedule DLS data load and transform patterns. The behaviour of each load and transform pattern is defined in the metadata control layer of DLS, and these metadata controls are maintained by data engineers and developers.
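To make the metadata-driven approach concrete, a control table might look something like the sketch below. This is an illustrative example only: the table, column and schema names are hypothetical and are not the actual DLS objects.

```sql
-- Hypothetical metadata control table; actual DLS object names will differ.
CREATE TABLE IF NOT EXISTS dls_control.load_metadata (
    source_system   VARCHAR  NOT NULL,  -- e.g. 'CRM', 'ERP'
    source_object   VARCHAR  NOT NULL,  -- table or file to extract
    load_pattern    VARCHAR  NOT NULL,  -- e.g. 'FULL', 'INCREMENTAL'
    target_table    VARCHAR  NOT NULL,  -- landing table in the warehouse
    watermark_col   VARCHAR,            -- driving column for incremental loads
    warehouse_name  VARCHAR,            -- optional dedicated compute
    is_active       BOOLEAN  DEFAULT TRUE
);

-- At run time, the orchestrator reads the active entries for a source
-- system and dispatches the matching load pattern for each object.
SELECT source_object, load_pattern, target_table
FROM   dls_control.load_metadata
WHERE  source_system = 'CRM'
AND    is_active;
```

Because behaviour lives in data rather than code, adding a new source object becomes a metadata insert rather than a new development task.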
Key features and benefits
Accelerated development – extraction of data from new or existing sources is expedited using the Load Patterns provided in the DLS. These load patterns are applied at the source object level, so different patterns can be used to extract data from the same source system. This lets developers focus more of their time on implementing the business logic of the data warehouse.
Reduced test cycle – because the Load Patterns are standardised in the DLS and the same code base is reused across all load patterns in the solution, the development team does not need to test data load code individually, significantly shortening the traditional testing cycle.
Low cost of maintenance and ownership – the entire platform is built using SQL, and the load patterns and transform logic are all deployed using ANSI SQL. Basic knowledge of SQL and data warehousing concepts is sufficient to maintain and manage a data warehouse platform built using DLS.
Elastic compute – since DLS runs in Snowflake, the load and transform capacity of DLS can be upgraded simply by adjusting Snowflake compute. DLS also allows a dedicated virtual warehouse to be assigned to a source load.
Adaptability – the DLS is designed to be modular, with a loosely coupled codebase, so it can be adapted for almost any data warehouse use-case and scenario. Since its introduction, DLS has been used to source data not just from traditional databases and files but also from streaming and time-series data sets.
Extensibility – DLS aids organizations in building a robust data warehouse platform, and the same patterns can be extended to build other data platforms such as a data quality engine or a data reconciliation engine.
Proven pattern – the DLS has been implemented across wide-ranging business verticals including utilities, financial services, media, and transport. It has now been delivered to, or is being implemented for, 10+ clients.
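The elastic-compute point above can be illustrated with standard Snowflake SQL. The warehouse name below is hypothetical; the statements themselves are stock Snowflake commands, not DLS-specific code.

```sql
-- Create a dedicated virtual warehouse for a heavy source load
CREATE WAREHOUSE IF NOT EXISTS dls_crm_load_wh
    WAREHOUSE_SIZE = 'SMALL'
    AUTO_SUSPEND   = 60      -- suspend after 60 s idle to save credits
    AUTO_RESUME    = TRUE;

-- Scale up for a large backfill, then scale back down afterwards
ALTER WAREHOUSE dls_crm_load_wh SET WAREHOUSE_SIZE = 'LARGE';
-- ... run the load ...
ALTER WAREHOUSE dls_crm_load_wh SET WAREHOUSE_SIZE = 'SMALL';
```

Because resizing is a single statement with no data movement, a source load can be given exactly the compute it needs without affecting any other workload.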
Interested in watching more of our past webinars? View our previous Free Data & Analytics webinar recordings here.