Experience Inc. Jobs

Job Information

V2soft Inc. Data Engineer General in Dearborn, Michigan

V2Soft ([www.v2soft.com](http://www.v2soft.com)) is a global company, headquartered in Bloomfield Hills, Michigan, with locations in Mexico, Italy, India, China, and Germany. At V2Soft, our mission is to provide high-performance technology solutions that solve real business problems. We become our customers' true partner, enabling both parties to enjoy success. We are committed to promoting diversity in the workplace, and believe it has a positive effect on our company and the customers we serve.

Description:

Key Responsibilities:

  • Data Pipeline Development:
    • Design, build, and maintain scalable and robust data pipelines on GCP using tools such as Apache Airflow, Cloud Composer, and Cloud Dataflow.
    • Implement data integration solutions to ingest data from various sources, including cloud storage and third-party APIs.
  • Data Warehousing:
    • Develop and optimize data warehouse solutions using BigQuery and other GCP services.
    • Ensure data accuracy, consistency, and security within the data warehouse environment.
    • Monitor and troubleshoot data pipeline and warehouse issues to maintain system reliability.
  • Cloud Platform Expertise:
    • Utilize GCP services such as Cloud Storage, Cloud Run, and Cloud Functions to build scalable and cost-effective data solutions.
    • Implement best practices for cloud infrastructure management, including resource provisioning, monitoring, and cost optimization.
  • Collaboration and Communication:
    • Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions.
    • Collaborate with cross-functional teams to design and implement data models, ETL processes, and reporting solutions.
  • Automation and Optimization:
    • Develop automated workflows using Apache Airflow and Astronomer to streamline data processing and improve efficiency.
    • Continuously optimize data pipelines for performance, scalability, and cost-effectiveness.
  • Documentation and Training:
    • Create and maintain comprehensive documentation for data pipelines, data models, and infrastructure components.
    • Provide training and support to team members and stakeholders on data engineering best practices and GCP services.
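The pipeline responsibilities above describe a standard ingest-transform-load pattern. As a minimal sketch only (not part of the posting; all function and field names here are hypothetical), the shape of such a pipeline step in plain Python looks like this — a production version on GCP would use Airflow/Cloud Composer operators and the BigQuery client instead of these stand-ins:

```python
# Minimal sketch of the ingest -> transform -> load pattern described above.
# All names are hypothetical illustrations; a real pipeline would replace the
# in-memory "table" with a BigQuery load job and wire these steps into a DAG.
import json


def extract(raw_records: str) -> list[dict]:
    """Ingest: parse newline-delimited JSON, as from a Cloud Storage export."""
    return [json.loads(line) for line in raw_records.splitlines() if line.strip()]


def transform(records: list[dict]) -> list[dict]:
    """Clean and normalize: drop rows missing an id, lowercase email addresses."""
    cleaned = []
    for record in records:
        if "id" not in record:
            continue  # enforce data consistency before loading
        record["email"] = record.get("email", "").lower()
        cleaned.append(record)
    return cleaned


def load(records: list[dict], table: list[dict]) -> int:
    """Load: append rows to an in-memory 'table' standing in for BigQuery."""
    table.extend(records)
    return len(records)


if __name__ == "__main__":
    raw = '{"id": 1, "email": "A@EXAMPLE.COM"}\n{"email": "no-id@example.com"}'
    table: list[dict] = []
    loaded = load(transform(extract(raw)), table)
    print(loaded, table[0]["email"])  # 1 a@example.com
```

Separating the three stages this way keeps each step independently testable and monitorable, which is what makes a pipeline straightforward to schedule and troubleshoot in an orchestrator like Airflow.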

Skills Required:

  • Proficiency in data pipeline tools and frameworks such as Apache Airflow, Cloud Composer, and Cloud Dataflow.
  • Strong knowledge of GCP services, including BigQuery, Cloud Storage, Cloud Run, and Cloud Functions.
  • Experience with SQL, Python, and other programming languages commonly used in data engineering.
  • Familiarity with data modeling, ETL processes, and data integration techniques.
  • Soft Skills:
    • Excellent problem-solving and analytical skills.
    • Strong communication and collaboration abilities.
    • Ability to work independently and as part of a team in a fast-paced, dynamic environment.

Experience Required:

  • 5+ years of data warehousing experience and 2+ years with GCP

Education Required:

  • Bachelor's degree in Science

Education Preferred:

  • Master's degree in Science

V2Soft is an Equal Opportunity Employer (EOE).

https://www.v2soft.com/careers
