WTW Senior Data Engineer (Airflow, Data Factory, Databricks) in Lima, Peru
As the Senior Data Integration Engineer at TRANZACT, you will play a critical role in driving a digital transformation initiative to seamlessly migrate our RDBMS-based Enterprise Data Warehouse and custom Enterprise integration platform to an innovative Next-Gen Data + AI ecosystem, anchored in the Databricks Platform and Data Mesh architecture. We are seeking a hands-on Data Integration Engineer with deep technical expertise in Apache Airflow, Azure Data Factory, and Databricks who understands how to build and scale a performant data platform and enjoys enabling other team members to do the same.
Lead the design and implementation of the data & analytics architecture, ensuring compliance, quality, and sustainable growth of the platform.
Build scalable data pipelines to integrate and model datasets from different sources that meet functional and non-functional requirements.
Provide the technology, tools, and data to support data scientists and analysts.
Manage the technical backlog on the Data & Analytics Platform.
Ensure data quality by implementing and monitoring a data quality framework together with the data source teams.
Work with the data infrastructure team to triage infrastructure issues and drive them to resolution.
Collaborate with cross-functional stakeholders to understand their business needs.
Formulate and complete end-to-end analyses that include data gathering, analysis, and ongoing scaled deliverables.
Foster a data-driven culture throughout the team and participate in data science projects that will have impact throughout the organization.
BS degree in Computer Science or a related field
5+ years of relevant experience as a Data Engineer, building data platforms, data lakes, and Business Intelligence solutions. Experience with Databricks preferred.
Proficient in implementing data pipelines (e.g., PySpark, Spark SQL), orchestration tools/services (e.g., Apache Airflow, Azure Data Factory), and testing frameworks.
Experience with one of the major cloud providers (AWS or Azure) and their Big Data services (Databricks, Kinesis, etc.). Experience leveraging AWS API Gateway, AWS Lambda, and AWS AppSync strongly preferred.
Experience with columnar, NoSQL, and relational databases (DynamoDB, Aurora, PostgreSQL), data modeling, and data management tools (Databricks).
Experience with reporting systems and visualization tools such as Power BI
Experience working with Agile methodologies
Experience with DevOps, DataOps and MLOps.
Experience in Data Science and Machine Learning
Knowledge of Data Mesh architectures
Knowledge of the Insurance Industry.
We are an equal opportunity employer.