
Senior Risk Data Engineer (PL/SQL) - JD
- Pune, Maharashtra
- Permanent
- Full-time
- Ingestion and provisioning of raw datasets, enriched tables, and/or curated, re-usable data assets to enable a variety of use cases.
- Driving improvements in the reliability and frequency of data ingestion, including increasing real-time coverage.
- Support and enhancement of data ingestion infrastructure and pipelines.
- Designing and implementing data pipelines that collect data from disparate sources across the enterprise and from external sources, and deliver it to our data platform.
- Building Extract, Transform and Load (ETL) workflows, using both advanced data manipulation tools and programmatic data manipulation throughout our data flows, ensuring data is available at each stage and in the form needed for each system, service, and customer along the flow.
- Identifying and onboarding data sources using existing schemas and, where required, conducting exploratory data analysis to investigate issues and provide solutions.
- Evaluate modern technologies, frameworks, and tools in the data engineering space to drive innovation and improve data processing capabilities.
- 3-8 years of experience in designing and implementing data warehouses and data lakes using the Oracle tech stack (DB: PL/SQL).
- At least 4 years of experience in database design and dimensional modelling using Oracle PL/SQL.
- Hands-on experience with advanced PL/SQL concepts such as materialized views, global temporary tables, partitions, and PL/SQL packages (see the first sketch after this list).
- Experience in SQL tuning, tuning of PL/SQL solutions, and physical optimization of databases (a tuning example follows the list).
- Experience in writing and tuning SQL scripts covering tables, views, indexes, and complex PL/SQL objects, including procedures, functions, triggers, and packages, in Oracle Database 11g or higher.
- Experience in developing ETL processes: ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. (see the ETL logging sketch after this list).
- Advanced working knowledge of SQL and experience with relational and NoSQL databases, as well as working familiarity with a variety of database platforms (Oracle, SQL Server, Neo4j).
- Strong analytical and critical thinking skills, with the ability to identify and resolve issues in data pipelines and systems.
- Strong understanding of ETL methodologies and best practices.
- Collaborate with cross-functional teams to ensure successful implementation of solutions.
- Experience with OLAP and OLTP databases, and with data structuring/modelling and an understanding of key data points.
- Experience working in Financial Crime, Financial Risk and Compliance technology transformation domains.
- Certification in any cloud tech stack.
- Experience building and optimizing data pipelines on AWS Glue or Oracle Cloud.
- Design and development of systems for maintaining the Azure/AWS lakehouse, ETL processes, business intelligence, and data ingestion pipelines for AI/ML use cases.
- Experience with data visualization (Power BI/Tableau) and SSRS.
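
For illustration, a minimal sketch of the advanced PL/SQL object types referenced in the requirements (partitioned table, global temporary table, materialized view, package). All object names are hypothetical and the DDL is indicative only, not a prescribed design:

```sql
-- Hypothetical names throughout; illustrative sketch only.

-- Interval-partitioned fact table (one partition per month)
CREATE TABLE risk_txn_fact (
  txn_id    NUMBER       NOT NULL,
  txn_date  DATE         NOT NULL,
  amount    NUMBER(18,2),
  CONSTRAINT pk_risk_txn_fact PRIMARY KEY (txn_id, txn_date)
)
PARTITION BY RANGE (txn_date)
INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
(PARTITION p_initial VALUES LESS THAN (DATE '2024-01-01'));

-- Global temporary table for per-session staging
CREATE GLOBAL TEMPORARY TABLE gtt_txn_stage (
  txn_id  NUMBER,
  amount  NUMBER(18,2)
) ON COMMIT PRESERVE ROWS;

-- Materialized view pre-aggregating monthly totals, refreshed on demand
CREATE MATERIALIZED VIEW mv_monthly_exposure
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND AS
SELECT TRUNC(txn_date, 'MM') AS txn_month, SUM(amount) AS total_amount
FROM   risk_txn_fact
GROUP  BY TRUNC(txn_date, 'MM');

-- Package grouping related load logic behind one interface
CREATE OR REPLACE PACKAGE pkg_risk_load AS
  PROCEDURE load_daily(p_run_date IN DATE);
END pkg_risk_load;
/
CREATE OR REPLACE PACKAGE BODY pkg_risk_load AS
  PROCEDURE load_daily(p_run_date IN DATE) IS
  BEGIN
    INSERT INTO risk_txn_fact (txn_id, txn_date, amount)
    SELECT txn_id, p_run_date, amount
    FROM   gtt_txn_stage;   -- rows staged earlier in the same session
    COMMIT;
  END load_daily;
END pkg_risk_load;
/
```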
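
A brief illustration of the SQL tuning workflow implied above, using the standard DBMS_XPLAN and DBMS_STATS packages; the query and table names are placeholders carried over from the sketch above:

```sql
-- Inspect the optimizer's execution plan for a candidate query
EXPLAIN PLAN FOR
SELECT t.txn_month, t.total_amount
FROM   mv_monthly_exposure t
WHERE  t.txn_month >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -3);

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Refresh optimizer statistics after large loads (table name hypothetical)
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'RISK_TXN_FACT');
```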
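
Finally, a minimal sketch of the ETL control-table and error-logging pattern mentioned in the requirements. Table and procedure names are hypothetical, and the autonomous transaction is one common way to keep error records after a rollback of the failed load:

```sql
-- Hypothetical control and audit tables for an ETL batch
CREATE TABLE etl_batch_control (
  batch_id     NUMBER        PRIMARY KEY,  -- populate from a sequence (or identity column on 12c+)
  source_name  VARCHAR2(100) NOT NULL,
  status       VARCHAR2(20)  DEFAULT 'RUNNING',
  rows_loaded  NUMBER        DEFAULT 0,
  started_at   TIMESTAMP     DEFAULT SYSTIMESTAMP,
  finished_at  TIMESTAMP
);

CREATE TABLE etl_error_log (
  batch_id    NUMBER,
  error_code  NUMBER,
  error_msg   VARCHAR2(4000),
  logged_at   TIMESTAMP DEFAULT SYSTIMESTAMP
);

-- Autonomous transaction so error records survive a rollback of the load
CREATE OR REPLACE PROCEDURE log_etl_error (
  p_batch_id IN NUMBER,
  p_code     IN NUMBER,
  p_msg      IN VARCHAR2
) IS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO etl_error_log (batch_id, error_code, error_msg)
  VALUES (p_batch_id, p_code, p_msg);
  COMMIT;
END log_etl_error;
/

-- Typical use inside a load procedure:
--   EXCEPTION
--     WHEN OTHERS THEN
--       log_etl_error(v_batch_id, SQLCODE, SQLERRM);
--       UPDATE etl_batch_control
--          SET status = 'FAILED', finished_at = SYSTIMESTAMP
--        WHERE batch_id = v_batch_id;
--       COMMIT;
--       RAISE;
```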