Senior Data Engineer
Euronext
- Porto
- Permanent
- Full-time
- Utilize expert knowledge of AWS services such as Lambda, Glue, and Step Functions to design, implement, and maintain scalable and efficient data solutions on the cloud platform.
- Develop robust solution architectures and cloud infrastructure designs, considering factors such as scalability, performance, security, and cost optimization.
- Demonstrate proficiency in cloud networking, including VPCs, subnets, security groups, and routing tables, to ensure secure and reliable data transmission.
- Data Modeling: Designing efficient data models for optimal query performance.
- SQL Proficiency: Writing and optimizing SQL queries.
- Performance Tuning: Identifying and optimizing performance bottlenecks.
- ETL and Data Integration: Extracting, transforming, and loading data into Redshift, MySQL, and PostgreSQL.
- Cluster Management: Provisioning, scaling, and monitoring Redshift clusters.
- Security and Compliance: Implementing security measures and ensuring compliance.
- AWS Integration: Integrating Redshift with other AWS services.
- Monitoring and Troubleshooting: Monitoring cluster performance and resolving issues.
- Documentation and Training: Creating documentation and providing training to team members.
- Logging and Tracing: Proficiency in setting up and managing logging and tracing mechanisms in AWS, including leveraging services like AWS CloudTrail for auditing API calls and AWS X-Ray for distributed tracing and performance analysis. Understanding of best practices for logging configuration, log aggregation, and analysis to ensure visibility into system activity and troubleshooting capabilities.
- Implement orchestration solutions using tools like Apache Airflow and AWS Step Functions to automate and manage data workflows effectively.
- Utilize Athena for interactive query analysis and exploration of large datasets stored in Amazon S3.
- Provide technical leadership and guidance to the team, acting as a subject matter expert in AWS and data engineering technologies. Write comprehensive solution documents and technical documentation to communicate architecture designs, data workflows, and best practices effectively.
- Demonstrate production awareness by proactively monitoring system health, automating checks, and anticipating potential issues to ensure smooth operation of data solutions.
- Utilize strong troubleshooting skills to identify and resolve issues promptly, minimizing downtime and impact on business operations.
- Stay updated on emerging technologies and industry trends, continuously evaluating and incorporating new tools and techniques to enhance data engineering processes and infrastructure.
- Take a proactive approach to challenge business requirements and propose innovative solutions to improve efficiency, scalability, and performance.
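As a small illustrative sketch of the ETL and SQL work described above — using Python's built-in SQLite in place of Redshift/PostgreSQL so the example is self-contained, with hypothetical table and column names:

```python
import sqlite3

# SQLite stands in for Redshift/PostgreSQL here; the "trades" schema
# and the sample records are purely hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (trade_id INTEGER PRIMARY KEY,"
    " symbol TEXT, price REAL, qty INTEGER)"
)

# "Extract": raw records as they might arrive from an upstream feed.
raw_trades = [
    (1, "ENX.PA", 102.5, 100),
    (2, "ENX.PA", 103.0, 250),
    (3, "ABC.LS", 15.2, 400),
]

# "Load": bulk insert with a parameterized statement.
conn.executemany("INSERT INTO trades VALUES (?, ?, ?, ?)", raw_trades)

# Analytical query: volume-weighted average price (VWAP) per symbol.
rows = conn.execute(
    """
    SELECT symbol,
           SUM(price * qty) / SUM(qty) AS vwap,
           SUM(qty) AS volume
    FROM trades
    GROUP BY symbol
    ORDER BY symbol
    """
).fetchall()
print(rows)
```

In a production pipeline the same pattern would typically run as a Glue or Lambda job writing to Redshift, orchestrated by Airflow or Step Functions.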
- BS/MS degree in Computer Science, Engineering or equivalent working experience
- English (B2 or higher level)
- Expertise in AWS: Extensive experience with AWS services, particularly Lambda, Glue, Step Functions, CloudFormation, CloudWatch, and others.
- Strong Solution Architecture Knowledge: Ability to design scalable and efficient data solutions on AWS, considering best practices for cloud architecture and infrastructure.
- Proficiency in Python and Databases: Strong programming skills in Python and experience with relational databases (MySQL, PostgreSQL, Redshift) and NoSQL databases.
- Orchestration and Workflow Management: Experience with orchestration tools such as Apache Airflow and AWS Step Functions for automating and managing data workflows.
- ETL Tools and Big Data Experience: Knowledge of ETL tools and experience working with large volumes of data, with a preference for Kafka experience.
- Experience with Iceberg Tables: Familiarity with Iceberg tables for managing large datasets efficiently, ensuring data consistency, and supporting ACID transactions.
- Production Awareness and Troubleshooting: Proactive approach to production monitoring and troubleshooting, with the ability to anticipate and mitigate potential issues.
- Technical Leadership and Communication: Capable of evolving into a technical lead role, with excellent communication and teamwork skills to collaborate effectively with cross-functional teams.
- Strong Analytical and Problem-Solving Skills: Ability to analyze requirements, define technical approaches, and propose innovative solutions to complex problems.
- Documentation and Requirements Analysis: Experience in writing solution documents, technical documentation, and the ability to challenge and refine business requirements.
- Knowledge in Apache Flink, Kafka, and other big data technologies.
- Experience with cloud-native architectures and serverless computing.
- Certification in AWS or relevant technologies.
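As a minimal sketch of the serverless work the role involves, the handler below follows the standard S3 event-notification shape and can be invoked locally; the bucket and key values are hypothetical:

```python
import json

def handler(event, context):
    """Extract (bucket, key) pairs from an S3 event notification."""
    objects = [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]
    return {"statusCode": 200, "body": json.dumps({"objects": objects})}

# Local invocation with a hand-built event (no AWS services required).
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "example-bucket"},
                "object": {"key": "incoming/trades.csv"}}}
    ]
}
result = handler(sample_event, None)
print(result["body"])
```

Testing handlers locally against synthetic events like this is a common way to exercise serverless code before deploying it behind real S3 triggers.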
- We respect and value the people we work with
- We are unified through a common purpose
- We embrace diversity and strive for inclusion
- We value transparency, communicate honestly and share information openly
- We act with integrity in everything we do
- We don’t hide our mistakes, and we learn from them
- We act with a sense of urgency and decisiveness
- We are adaptable, responsive and embrace change
- We take smart risks
- We are positively driven to make a difference and challenge the status quo
- We focus on and encourage personal leadership
- We motivate each other with our ambition
- We deliver maximum value to our customers and stakeholders
- We take ownership and are accountable for the outcome
- We reward and celebrate performance