
Job Information

Apex Systems, Inc Data Engineer General in Dearborn, Michigan

Job#: 2021773

Job Description:

Position Description:

Position Responsibilities:
* Design, develop and implement data engineering solutions using standards, templates, patterns and best practices on the Google Cloud Platform
  o Build full-stack data engineering solutions utilizing both structured and unstructured data: development, ingestion, curation, implementation, deployment, automation and monitoring
  o Collaborate with the Data Factory Engineering Organization, Data Architecture, PEM, GDIA, Information Technology and data consumers to drive data engineering capabilities, product design, proofs of concept and MVPs, to expand understanding, define technical optimizations, explore configurations and overcome challenges
  o Create high-quality, elegant data engineering solutions that focus on cloud-first design, encapsulation, repeatability, automation and auditability
  o Work as an individual contributor and as part of a team to build, test, maintain and troubleshoot data solutions
* Continuously integrate and deploy data solutions via CI/CD
* Use test-driven development and code-pairing practices
* Develop and maintain comprehensive documentation for data engineering processes, systems and pipelines, ensuring clarity, transparency and alignment with customer requirements

Skills Required:
* Technical documentation skills, including translating business requirements into technical specifications
* Understanding of the GCP ecosystem, with a focus on BigQuery and Dataflow
* Ability to design and code analytical solutions for data collections
* Ability to develop data quality and validation routines (a minimal illustrative sketch follows at the end of this section)
* Ability to test data products during the development process

Skills Preferred:
* Strong oral and written communication skills
* Ability to write complex SQL queries to query and analyze data
* Ability to communicate complex solution concepts in simple terms
* Ability to apply multiple solutions to business problems
* Ability to quickly comprehend the functions and capabilities of new technologies

Experience Required:
* 1 year of academic or work experience with one or more of the following:
  o Data design, data architecture and data modeling (both transactional and analytic)
  o Building big data pipelines for operational and analytical solutions
  o Running and tuning queries in databases including BigQuery, SQL Server, Hive or other equivalent platforms
  o Data management, including running queries and compiling data for analytics
  o Developing code in one or more languages such as Java, Python and SQL

Experience Preferred:
* 2+ years of experience with the following:
  o GCP cloud data implementation projects (Dataflow, Airflow, BigQuery, Cloud Storage, Cloud Build, Cloud Run, etc.)
* Experience with Agile methodologies and tools such as Rally or Jira
* Certification: Google Professional Data Engineer
* Experience programming and producing working models or transformations with modern programming languages
* Knowledge of or experience with designing and deploying data processing systems using one or more technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Teradata, Tableau, Qlik or others
* Strong team player, with the ability to collaborate well with others, solve problems and actively incorporate input from various sources
* Demonstrated customer focus, with the ability to evaluate decisions through the eyes of the customer, build strong customer relationships and create processes from the customer's viewpoint
* Strong analytical and problem-solving skills, with the ability to communicate clearly and succinctly and to effectively evaluate information and data when making decisions
* Resourceful and quick learner, with the ability to efficiently seek out, learn and apply new areas of expertise
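Purely as an illustration of the kind of work the bullets above describe (querying BigQuery and writing a basic data quality and validation routine), the sketch below uses the google-cloud-bigquery Python client. It is not part of the posting: the project, dataset, table and column names are hypothetical, and the check shown is a simple null count, not any specific routine used by the team.

    # Illustrative only: names below are hypothetical, not from the posting.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Simple data quality check: count rows and nulls in a key column.
    query = """
        SELECT
          COUNT(*) AS total_rows,
          COUNTIF(event_id IS NULL) AS missing_event_id
        FROM `example-project.example_dataset.example_table`
    """

    rows = client.query(query).result()  # runs the job and waits for it to finish
    for row in rows:
        if row.missing_event_id > 0:
            print(f"Validation failed: {row.missing_event_id} of {row.total_rows} rows missing event_id")
        else:
            print(f"Validation passed: {row.total_rows} rows checked")

Running a sketch like this requires the google-cloud-bigquery package and application default credentials for the target project.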
