Genesis Corp dba Genesis10 Sr. Cloud Data Warehouse Solution Lead in Horsham, Pennsylvania
Reference #: 21141768 Genesis10 is seeking a Sr. Cloud Data Warehouse Solution Lead for a contract with our client in Horsham, PA. 100% remote.
Job Description: The Sr. Cloud Data Warehouse Solution Lead will be a key technical resource during the implementation of a cloud-centric data management architecture. The primary role of the Solution Lead is to take the design and technical specifications and develop effective, high-quality technical solutions that meet the business requirements. The technical domain of the Solution Lead includes Data Acquisition/Integration (DA/DI) best practices, tools, and technologies; security in the environment; data warehousing and dimensional modeling concepts; DI development; performance tuning; and support for system testing. We are looking for a passionate professional, preferably with 5+ years of experience in cloud implementations. This individual should be self-driven with an ownership mindset, motivated to continuously learn and invest in personal development, and committed to keeping abreast of leading-edge data management technologies in the industry.
Key Responsibilities:
* Provide technical leadership in design and engineering for the modernization of legacy data ingestion, ETL/ELT, and database workloads to new technologies in the public cloud (AWS/Azure) and big data space
* As a solution lead, define technical product capabilities and features, and prioritize and direct product engineering delivery as part of an agile team
* Navigate ambiguity with a self-driven, ownership mindset; identify options to resolve constraints, mitigate implementation risks, and define solutions to implementation challenges with minimal supervision
* As part of the data management solution, implement appropriate data access and security patterns in alignment with regulatory and compliance standards such as HIPAA, PHI, PII, locational data restrictions, contractual requirements, etc.
* Build partnership and rapport with client technical resources by demonstrating strong written and verbal communication, presentation, analytical, and interpersonal skills
* Work closely with Business Analysts and SMEs to understand business requirements
* Work in a matrixed team environment with distributed responsibility across different teams
* Create process flows using diagramming tools such as Visio and OmniGraffle
* Perform data profiling and analysis of structured and unstructured data
* Define new interface layouts (custom or healthcare industry standards) for integration into the data warehouse
* Create source-to-target mappings and ETL/ELT pipeline designs for integrating new/modified data streams into on-prem/cloud-based data warehouses and data marts
* Create and validate ETL/ELT pipeline processes using technologies such as Databricks, ADF, Informatica, Talend, SSIS, Spark, Python, Scala, etc.
* Automate pipeline processes using scheduling tools such as Airflow
* Adapt, learn, work, and deliver in a fast-paced environment, at times with limited information, to support business needs
* Be proactive and diligent in identifying and communicating scope, design, and development issues and recommending potential solutions
* Conduct ETL/ELT unit testing; participate in system and integration testing; identify and remedy solution defects
Required Qualifications:
* Bachelor's Degree in Computer Science, Information Technology, or a related area
* 8+ years of experience designing, scaling, and/or implementing data warehouse and analytics solutions on prem and in the cloud
* 5+ years of experience with data warehousing concepts such as ETL, data quality management, data privacy & security, and MDM, along with experience in technologies such as Databricks, ADF, Informatica, DataStage, SSIS, Talend, Collibra, Hadoop, Cloudera, graph databases, MapReduce, etc.
* Proficiency in agile software delivery, including CI/CD and DevOps, with constituent applications such as Jira, Rally, Aha, GitHub, Selenium, etc.
* Excellent understanding of enterprise data warehouse data models and dimensional modeling concepts, source-to-target mapping, and data pipeline architectures
* 5+ years of experience working as part of data engineering teams through the product life cycle: requirements, design, development, testing, and deployment
* 2+ years of experience in one or more programming languages such as Python, Scala, or Java
* Demonstrated ability to lead small teams, meet tight deadlines, follow development standards, and effectively raise critical issues with the client
* Excellent oral and written communication skills
* Based on business needs and permissible travel conditions, 50%-70% travel could be expected
Preferred Qualifications:
* 3+ years of experience designing and implementing healthcare data and information management solutions, with familiarity with healthcare payer/provider data with respect to membership, enrollment, claims (medical, pharmacy, etc.), and clinical data
* Knowledge of Healthcare Interoperability standards (FHIR, HL7), EDI transactions, and related formats, including FHIR APIs
* Previous design and implementation experience with enterprise data and cloud initiatives such as cloud-native data warehouses and the implementation of data management & analytics solutions on Azure, AWS, and/or Google Cloud
* 5+ years of experience designing and implementing health insurance data and information management solutions, with familiarity with provider data
* 3+ years of experience in agile software delivery, including CI/CD and DevOps, with constituent applications such as Jira, Rally, Aha, GitHub, Selenium, etc.
* Previous working experience building and automating AI/ML workstreams, from data analysis, experimentation, operationalization, model training, and model tuning through to visualization
* Understanding of digital and data engineering architecture patterns such as containerization, microservices, streaming, event sourcing, etc.
* Previous working experience with Java and/or Kafka streaming architectures
* Previous working experience with containerization using Kubernetes, Docker, OpenShift, etc.
* Previous working knowledge of the cloud data engineering technology stack, architecture patterns, and best practices; a Cloud Data Engineering certification would be a plus
* Previous working experience in the Big Data/cloud ecosystem, with the ability to design, develop, document, and architect Hadoop/cloud applications
* Configuration management using GitHub and/or other version control tools
* Experience with infrastructure as code using Terraform and Helm deployments
* Experience developing and deploying custom APIs
Compensation: Hourly W2 pay rate $60.00-$65.00. We have access to additional contract, contract-to-hire, and direct hire positions with various rate ranges.
If you have the described qualifications and are interested in this exciting opportunity, apply today!
Ranked a Top Staffing Firm in the U.S. by Staffing Industry Analysts for six consecutive years, Genesis10 puts thousands of consultants and employees to work across the United States every year in contract, contract-for-hire, and permanent placement roles. With more than 300 active clients, Genesis10 provides access to many of the Fortune 100 firms and a variety of mid-market organizations across the full spectrum of industry verticals.
For contract roles, Genesis10 offers the benefits listed below. If this is a perm-placement opportunity, our recruiter can talk you through the unique benefits offered for that particular client. Benefits of Working with Genesis10: Access to hundreds of clients, most of whom have been working