Senior Engineer - Big Data Operations

Team Pasona India

Details of the offer

Roles and Responsibilities
Serve as a lead on the Operations team, supporting and operating key aspects of infrastructure services, including security, capacity planning, availability, and performance.
Coordinate shift rotations with teams in different geographic regions.
Create procedures/runbooks for the operational and security aspects of the platform.
Improve infrastructure by developing and enhancing automation tools.

Provide advanced business and engineering support services to end users:
Gather business requirements
Lead other admins and platform engineers through design and implementation decisions to achieve balance between strategic design and tactical needs
Research and deploy new tools and frameworks to build a sustainable big data platform
Assist with creating programs for training and onboarding for new end users
Lead Agile/Kanban workflows and team process work
Troubleshoot issues to resolve problems
Provide status updates to Operations product owner and stakeholders
Track all details in the issue tracking system (JIRA)
Review issues and triage problems for new service/support requests
Use DevOps automation tools, including Jenkins build jobs
Fulfill ad-hoc data and report requests from different functional groups.
Be on call for second or third shifts.
Requirements

Cloud: Working experience with and a good understanding of AWS or Azure environments; advanced experience with IAM policy and role management.
Security: Experience implementing role-based security, including AD integration, security policies, and auditing in a Linux/Hadoop/AWS environment. Familiarity with penetration testing and scanning tools for remediating security vulnerabilities.
Infrastructure Operations: 5+ years supporting systems infrastructure operations, upgrades, deployments, and monitoring.
Programming: 3+ years of experience with the Java programming language.
Hadoop: Experience with Hadoop (Hive, Spark, Sqoop) and/or AWS EMR.
DevOps: Experience with DevOps automation, including orchestration/configuration management and CI/CD tools (Jenkins).
Version Control: Working experience with one or more version control platforms, such as GitHub.
ETL: Experience with a job scheduler such as Oozie or Airflow; Airflow experience is a plus.
Data Science tools (nice to have): R, RStudio, TensorFlow.
Monitoring: Hands-on experience with monitoring tools such as AWS CloudWatch, Datadog, and Elasticsearch.
Networking: Working knowledge of TCP/IP networking, SMTP, HTTP, load balancers (ELB), and high-availability architecture.
Demonstrated success in learning new technologies quickly.

Source: Careesma

