Nielsen Technical Manager (Big Data, Java/Scala, AWS, Spark on Kubernetes) in Bangalore, India

At Nielsen, we believe that career growth is a partnership. You ultimately own, fuel and set the journey. By joining our team of nearly 14,000 associates, you will become part of a community that will help you to succeed. We champion you because when you succeed, we do too. Embark on a new initiative, explore a fresh approach, and take license to think big, so we can all continuously improve. We enable your best to power our future.

Requirements

  • As a Technical Manager, you will lead a team of DevOps engineers focused on analyzing, developing, testing, and supporting Big Data applications deployed on Cloud (AWS) infrastructure.

  • Your primary objective is to ensure project goals are achieved and aligned with business objectives. You will also work closely with your Scrum team and program team to develop, test, refine, and deliver quality software to production via standard Agile methodologies.

Responsibilities

  • Results-oriented leader who manages staff and plans and evaluates goals and objectives, aligning them with the technology strategy in partnership with senior management and stakeholders.

  • Hands-on leader who oversees highly motivated DevOps engineers, providing technical direction and guiding day-to-day activities for technical project implementations.

  • Interface with the product manager to review the roadmap and requirements; participate in Agile/Scrum ceremonies to lead the team in delivering on commitments.

  • Work closely with the reporting manager, architect, and team to review the architecture, design, and implementation of solutions to ensure they are highly robust.

  • Collaborate with cross-functional teams to oversee end-to-end integrations, data contracts, and compliance with requirements for privacy, security and audit.

  • Design, implement and optimize comprehensive automation solutions using platform infrastructure and service offerings.

  • Plan, develop, and promote automation in the areas of cloud resource provisioning (IaC), unit testing, regression testing, code coverage, static code analysis, and vulnerability scanning through fully automated CI/CD processes.

  • Implement comprehensive monitoring and data quality checks to ensure quality remains a top priority (see the sketch after this list).

  • Research, evaluate and conduct POCs to recommend new technologies and tools to improve and optimize solutions.

  • Promote and build reusable code and components through innersourcing to be used by multiple project teams.

  • Promote a culture of best practices with peer code reviews and extreme ownership for continuous incremental delivery.

  • Document and maintain standards, best practices, and design/architecture patterns.

  • Motivate staff by providing a creative space with guidance and mentoring for skill development and growth.
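
As a purely illustrative sketch of the monitoring and data quality checks mentioned above, the Scala/Spark snippet below counts null and duplicate keys in a hypothetical events dataset and fails the job when either rule is violated. The bucket path, column names, and rules are assumptions for illustration, not part of any actual Nielsen pipeline.

    import org.apache.spark.sql.{DataFrame, SparkSession}
    import org.apache.spark.sql.functions._

    object DataQualityCheckSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("data-quality-check-sketch")
          .getOrCreate()

        // Hypothetical input; a real pipeline would read a governed dataset.
        val events: DataFrame = spark.read.parquet("s3a://example-bucket/events/")

        // Rule 1: the record key must never be null.
        val nullKeys = events.filter(col("event_id").isNull).count()

        // Rule 2: the record key must be unique.
        val duplicates = events.groupBy("event_id").count().filter(col("count") > 1).count()

        // Fail the job (and the CI/CD pipeline around it) when a rule is violated.
        require(nullKeys == 0, s"Found $nullKeys events with a null event_id")
        require(duplicates == 0, s"Found $duplicates duplicated event_id values")

        spark.stop()
      }
    }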

Desired Skills

  • 10+ years of hands-on software development experience, with a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field

  • Must have strong hands-on implementation expertise in cloud architecture

  • Must have very good knowledge of storage, network, and compute services, and sound knowledge of multi-zone, multi-region designs.

  • Must have the ability to provide solutions utilizing best practices for resilience, scalability, cloud optimization and security.

Technical Skills

  • Extensive programming experience and a desire to spend 40-50% of the time writing and reviewing code.

  • 8+ years of hands-on big data development experience, including 2-3 years in a leadership role, with an appreciation for object-oriented and functional programming paradigms.

  • Hands-on experience with Scala/Java, Spark, and Spark on K8s (EKS), plus familiarity with Python (a minimal sketch follows this list).

  • Experience developing cloud-hosted (AWS), containerized applications and services.

  • AWS components: EKS, EMR, EC2, S3, Lambda, Relational Database Service (RDS), Simple Notification Service (SNS), ElastiCache, etc.

  • Familiarity with relational and big data storage technologies: RDS, Postgres, and columnar storage and file formats.

  • Must have strong analytical and technical knowledge of data modeling and the Big Data tech stack, with a passion for conducting due diligence and troubleshooting issues.

  • Demonstrated knowledge of CI/CD processes, testing frameworks, and tools (GitLab, JUnit, Terraform, JFrog, JaCoCo, SonarQube, etc.)

  • Familiarity with Linux platforms and knowledge of shell scripting.

  • Knowledge of networking and protocols: HTTP/HTTPS, TLS/SSL/certificates, TCP/IP.

  • Experience with source code control tools; GitLab experience is a plus.
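
For orientation only, the sketch below shows the shape of a minimal Scala/Spark batch job of the kind referenced above; on Spark on Kubernetes (EKS) it would be packaged into a container image and submitted with a k8s:// master URL. All bucket names, column names, and paths are placeholders, not Nielsen specifics.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object DailyAggregateSketch {
      def main(args: Array[String]): Unit = {
        // On Spark on Kubernetes (EKS), this application would typically be launched via
        //   spark-submit --master k8s://https://<eks-api-endpoint> --deploy-mode cluster ...
        // with the job packaged into a container image.
        val spark = SparkSession.builder()
          .appName("daily-aggregate-sketch")
          .getOrCreate()

        // Hypothetical input: impression events stored as Parquet on S3.
        val impressions = spark.read.parquet("s3a://example-bucket/impressions/date=2024-01-01/")

        // Aggregate impressions per campaign (column names are assumptions).
        val dailyTotals = impressions
          .groupBy("campaign_id")
          .agg(count(lit(1)).as("impression_count"))

        dailyTotals.write.mode("overwrite").parquet("s3a://example-bucket/aggregates/daily/")

        spark.stop()
      }
    }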

Mindset and Attributes

  • Strong communication skills, with the ability to explain complex technical concepts and align the organization on decisions

  • Sound problem-solving skills, with the ability to process complex information and articulate and present it clearly

  • Utilizes team collaboration to create innovative solutions efficiently

  • Passion for researching and conducting POCs to optimize solutions
