
Job Information

COGNIZANT TECHNOLOGY SOLUTIONS US CORP. AWS Databricks Engineer - Remote in Grove City, Ohio

Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process outsourcing services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500, and we are among the top performing and fastest growing companies in the world.

Practice - AIA - Artificial Intelligence and Analytics

About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future: a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies. By applying AI and data science, we help leading companies prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant's AIA practice takes insights that are buried in data and gives businesses a clear way to transform how they source, interpret, and consume their information. Our clients need flexible data structures and a streamlined data architecture that quickly turns data resources into informative, meaningful intelligence.

This role does not support visa-dependent candidates.

Application expiration date: 11/10/2024

Job Summary

We are seeking a highly skilled and motivated Sr. Developer with 8 to 10 years of experience to fill a Data Engineer contractor role on our team. The ideal candidate will have extensive experience in AWS, Databricks, Python, Trillium, Teradata, SQL, Unix scripting, and Informatica PowerCenter. This role requires a strong understanding of data engineering techniques, data cleansing, and matching processes, as well as excellent problem-solving skills.

Responsibilities

Key Responsibilities

* Utilize Trillium for data cleansing, standardization, and matching processes, with a focus on US Census data file matching.
* Manage and optimize AWS services, including S3, EFS, EBS, Lambda, and IAM roles.
* Perform data engineering tasks using Databricks, including integrating JSON files from S3 into the raw layer and applying best practices (see the illustrative sketch after the qualifications list below).
* Develop and maintain Python scripts for data processing and automation.
* Extract data from various data stores, including relational databases and file structures such as CSV, XML, and JSON.
* Use Teradata utilities (BTEQ, FastLoad, MultiLoad) for data extraction and manipulation.
* Write and maintain Unix shell scripts, including wrapper scripts, and monitor Unix logs for errors.
* Create and troubleshoot complex SQL queries for backend testing and production issue resolution.
* Utilize Informatica PowerCenter Client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) for ETL processes.

Qualifications

* Expertise in Trillium Control Centre cleansing, standardization, and matching processes.
* Strong knowledge of AWS services (S3, EFS, EBS, Lambda) and IAM roles.
* Proficient in Databricks data engineering and best practices.
* Advanced skills in Python programming.
* Experience with Teradata and its utilities (BTEQ, FastLoad, MultiLoad).
* Proficient in SQL and Unix scripting.
* Experience with data extraction from various data stores and file structures.
* Strong proficiency in Informatica PowerCenter Client tools.
* Excellent problem-solving skills and attention to detail.
* Ability to perform as Developer and Data Analyst as needed.
* Well-organized, quick learner, and self-motivated.
* Effective verbal and written communication skills.
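For illustration only (not part of the formal requirements above), the following minimal PySpark sketch shows the kind of Databricks raw-layer ingestion described in the responsibilities: reading JSON files from S3 and appending them to a raw-layer Delta table. The bucket path, schema, and table name are hypothetical placeholders.

    # Illustrative sketch only: ingest JSON files from S3 into a raw-layer Delta table.
    # The bucket, path, and table names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically on a Databricks cluster

    # Read JSON documents from the S3 landing path.
    raw_df = (
        spark.read
        .option("multiLine", "true")  # handle pretty-printed JSON documents
        .json("s3://example-bucket/landing/census/*.json")
    )

    # Add basic audit columns commonly kept in a raw/bronze layer.
    raw_df = (
        raw_df
        .withColumn("_ingest_ts", F.current_timestamp())
        .withColumn("_source_file", F.input_file_name())
    )

    # Persist to the raw layer as an append-only Delta table.
    (
        raw_df.write
        .format("delta")
        .mode("append")
        .saveAsTable("raw.census_files")
    )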
