Enterprise Data Architect
Northern Trust
- USA
- Permanent
- Full-time
- Define and engineer core services and practices to enable distributed development teams to modernize our data infrastructure
- Stay hands-on in developing and maintaining the framework, cookbook, and bootstrap code for data infrastructure
- Advise, mentor, and assist distributed development teams on the best data architecture approaches to solve business problems in line with Northern Trust's strategic data infrastructure plans
- Partner with Data Governance team to ensure the interoperability between our data tooling and the metadata collections maintained in Collibra.
- Ensure all non-functional requirements (e.g., security, performance, availability and DR/failover, privacy, scalability, compliance) have a reference pattern for how they are achieved in our strategic data infrastructure
- Evaluate, prototype, test and recommend emerging data technologies and platforms from open source or vendors.
- Lead or participate in R&D efforts to build proof of concept prototypes
- Communicate complex technical topics to non-technical business stakeholders and senior executives, and assist with scoping and architecting cloud data solutions.
- Create reference patterns to demonstrate compliance with Control Standards for all aspects of data architecture (safe handling of sensitive data in motion and in storage).
- At least 4 years of experience developing data solutions and services in an on-prem and/or cloud environment.
- Deep understanding of data warehouse, data lake architecture, and data management processes.
- Familiarity with Data Governance practices and tooling such as Collibra.
- Hands-on expertise with multiple modern storage platforms and services such as Snowflake, Azure SQL, ADLS2, Azure Cosmos DB, Databricks, MongoDB, CockroachDB, DataStax, Oracle, Sybase, DB2, or Redis
- Working experience with data pipeline (ELT or ETL) tools such as Talend, NiFi, dbt, DataStage, or ADE
- Skilled in a variety of languages such as Java or Python
- Experience writing scripts for Linux shell and/or Windows PowerShell
- Strong organization and communication skills
- Bachelor's degree in Computer Science or related discipline
- Knowledge of on-prem/cloud data virtualization such as Tibco DV, Denodo, Starburst, or Dremio.
- Knowledge of data analytics technology and methodology
- Experience with DevOps, DataOps and/or MLOps using ADO
- Working knowledge of data security tools (e.g. SecuPi)
- Knowledge of Infrastructure as Code and automation tools such as Terraform or Ansible
- Working experience with Azure and AWS cloud data and service offerings
- Experience in the development of Control Standards for database architecture