Data Optimization Engineer
MP Solutions
- Budapest
- Permanent
- Full-time
- Design and development of Cloud/Hadoop solutions.
- Undertaking end-to-end project delivery (from inception to post-implementation support), including review and finalization of business requirements, creation of functional specifications and/or system designs, and ensuring that end-solution meets business needs and expectations.
- Analysis of existing designs and interfaces and applying design modifications or enhancements.
- Providing insights and resolution for ad-hoc issues raised by Sandbox users.
- Testing software components and complete solutions (including debugging and troubleshooting) and preparing migration documentation.
- Providing reporting-line transparency through periodic updates on project or task status.
- Bachelor's Degree in Engineering, preferably Computer Science/Engineering.
- Minimum of 5-7 years of relevant work experience.
- Experience with the technical analysis, design, development, and implementation of Data Lake and cloud solutions.
- Hands-on experience with Hadoop and cloud technologies; strong RDBMS concepts and SQL skills. Experience with data analytics tools, preferably DataIKU.
- Strong UNIX Shell scripting or Python experience to support data warehousing solutions.
- Process-oriented, focused on standardization, streamlining, and implementation of a best-practices delivery approach.
- Excellent problem solving and analytical skills.
- Excellent verbal and written communication skills.
- Experience in optimizing large data loads.
- Proven teamwork in multi-site/multi-geography organizations.
- Ability to multi-task and function efficiently in a fast-paced environment.
- Self-starter with flexibility and adaptability in a dynamic work environment.
- Ability to perform root-cause analysis of issues and design solutions as required.
- Ability to gather user requirements and interface with business partners as required.
- Complete team accountability: deliverables, discipline, consistency.
- Responsible for quantitatively tracking the team's success.
- Experience with detailed Error/Exception handling and Data Validation issues.
- Experience with common data analytics tools such as Tableau and Business Objects, and a good understanding of modelling platforms such as Databricks.
- Strong understanding of the data warehousing domain.
- Ability to architect an ETL solution and data conversion strategy; knowledge of Hadoop/cloud processing approaches and performance optimization.
- Experience in Agile methodology.
- Experience in developing, managing, and maintaining data dictionaries and/or metadata.
- Good team-player approach.