Job Information

Sabre Team Lead Software Systems Engineering in Bangalore, India

Req ID: 49577

Job Family: Information Technology/Software Development

Sabre operates in 29 markets across Asia Pacific, with over 2,000 local staff serving premium blue-chip customers in the travel industry, from travel agencies and airlines to hotels, car rental firms and insurance providers. Sabre has grown its presence significantly over the years to tap the immense potential of the world's fastest-growing region. Sabre's end-to-end technologies help create a differentiated portfolio of solutions and services, driven by data and insights into how the travel business operates in Asia Pacific. We help our travel partners create better customer experiences, optimise business operations and enhance competitiveness. As we continue to grow in the region, our people are and always will be our biggest asset and investment; we continue to bring together the best talent and help them achieve their career aspirations in a truly global company.

Job Description

The Sabre Enterprise Data & Analytics Big Data Hadoop Platform team is primarily responsible for the stability of the Hadoop big data lake platform and its applications. We are looking to hire a Team Lead Hadoop Administrator to provide application and system support and to help develop scripts for platform automation. The main role of the Team Lead Hadoop Administrator is to investigate and diagnose issues with big data systems and with data ingestion and transformation processes as part of the Hadoop big data lake team. This work includes researching the problem, identifying the root cause, developing and executing workarounds, and documenting the work. When appropriate, the Hadoop big data administrator will help develop tools that make workarounds easier to implement. This includes working closely with the Development and Architecture teams and internal Sabre stakeholders.

Responsibilities:

• Implement and provide ongoing administration of the Hadoop infrastructure

• Work with the Dev teams to optimize cluster usage and ensure timely execution of business-critical workloads

• Install Hadoop updates, patches, and version upgrades as required

• Perform routine cluster maintenance, such as provisioning new nodes and performing HDFS backups and restores

• Perform routine cluster monitoring and troubleshooting

• Configure and monitor job isolation, security and resource queues

• Develop scripts and tools to automate common Hadoop administration tasks (a minimal Python sketch follows this list)

• Participate in an on-call rotation with 24x7 availability requirements

• Screen Hadoop cluster job performance and carry out capacity planning

• Monitor Hadoop cluster connectivity and security

• Manage and review Hadoop log files

• Manage and monitor the file system

• Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments

• Set up Linux users, set up Kerberos principals, and test HDFS, Hive, Pig and MapReduce access for new users

• Maintain the cluster, including creating and removing nodes, using tools such as Ganglia, Nagios, Cloudera Manager Enterprise and Dell OpenManage

• Tune the performance of Hadoop clusters and Hadoop MapReduce routines

• Support and maintain HDFS

• Develop CloudFormation, Ansible and Cloudera Navigator scripts for cloud migration and Hadoop platform automation

• Maintain a clear understanding of security concepts, including Kerberos, Key Trustee, TLS and LDAP

• Team diligently with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability

• Collaborate with application teams to install operating system and Hadoop updates, patches and version upgrades when required

• Serve as the point of contact for vendor escalations
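
As a purely illustrative flavor of the automation and monitoring work described above (nothing here comes from the posting itself; the hostname, port and user are placeholder assumptions), a routine HDFS usage check in Python might look like the sketch below. It assumes WebHDFS is enabled and that simple user.name authentication is acceptable; a Kerberized cluster would need SPNEGO authentication (e.g. via requests-gssapi) instead.

    import requests

    # Placeholder NameNode address; 9870 is the default WebHDFS HTTP port in Hadoop 3.x.
    NAMENODE = "http://namenode.example.com:9870"

    def hdfs_usage(path):
        """Fetch file count and space consumed for an HDFS subtree via WebHDFS."""
        resp = requests.get(
            NAMENODE + "/webhdfs/v1" + path,
            # user.name is simple auth; a Kerberized cluster needs SPNEGO instead.
            params={"op": "GETCONTENTSUMMARY", "user.name": "hdfs"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["ContentSummary"]

    if __name__ == "__main__":
        summary = hdfs_usage("/user")
        print("files: %d, space consumed: %.1f GiB"
              % (summary["fileCount"], summary["spaceConsumed"] / 2**30))

A script like this could feed thresholds into Nagios or Cloudera Manager alerts rather than printing to stdout; the REST call is the portable part.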

Job Requirements

REQUIRED SKILLS:

• Five or more years of experience, with at least four years as a Hadoop administrator

• Experience as an architect building enterprise-scale, production-quality products and solutions

• At least 3 years of enterprise-level big data systems architecture, design and development experience, and 5+ years of enterprise-level application, database, data warehouse and BI development; hands-on experience with big data systems such as Hadoop, Vertica, HBase and Cassandra is preferred

• Experience in XML, JMS (Message Queues) and Web Services technologies

• Experience with JUnit (or any Java unit-testing framework), test-driven development and Maven

• Strong scripting skills, preferably in Python and Linux shell, with experience automating tasks through scripting (a minimal monitoring sketch follows this list); demonstrated knowledge of Linux operating systems

• Demonstrated knowledge of Object-Oriented Analysis and Design

• Functional understanding of relational databases (Teradata, Vertica, Oracle) and big data stores, or similar databases

• Proficient knowledge of complex SQL on any major RDBMS, including SQL performance tuning

• Good experience with OLAP concepts and methods

• Knowledge of Web Technologies such as HTML, CSS, JavaScript

• Design and development using the Java development stack: Java EE, JSF, Hibernate, aspect-oriented programming (AOP), web services and the Spring Framework

• Experience with Hadoop monitoring tools (Nagios, Ganglia, Cloudera Manager)

• Experience in Web/App Server & SOA administration (Tomcat, JBoss, etc.)

• Knowledge of AWS, Azure, EMR and HDInsight

• Knowledge of the big data ecosystem (Hadoop, HDFS, MapReduce, YARN, Pig, Hive, Oozie, Sqoop, Flume) and how to derive insights/analytics from big data
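
To give a concrete, hedged sense of the Python scripting and monitoring skills listed above, the sketch below polls the YARN ResourceManager REST API for cluster metrics. The host, port and 80% alert threshold are assumptions for illustration, not values from this posting.

    import requests

    # Placeholder ResourceManager address; 8088 is the default RM web UI port.
    RESOURCE_MANAGER = "http://rm.example.com:8088"

    def cluster_metrics():
        """Return the clusterMetrics object from the YARN ResourceManager REST API."""
        resp = requests.get(RESOURCE_MANAGER + "/ws/v1/cluster/metrics", timeout=30)
        resp.raise_for_status()
        return resp.json()["clusterMetrics"]

    if __name__ == "__main__":
        m = cluster_metrics()
        used_pct = 100.0 * m["allocatedMB"] / max(m["totalMB"], 1)
        print("%d apps running, memory %.0f%% allocated" % (m["appsRunning"], used_pct))
        # Illustrative threshold only; real alerting would run through Nagios,
        # Ganglia or Cloudera Manager as named in the requirements.
        if used_pct > 80:
            print("WARNING: cluster memory allocation above 80%")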

PREFERRED SKILLS:

• Proficient theoretical and practical knowledge of Hadoop big data, cloud, Unix and data warehousing/database systems (preferably Teradata or Vertica), or other large database systems

• Data integration, extraction, load (ETL) and transformation using Talend or similar tools such as BODI, Informatica, DataStage, SSIS, or Java Spring technologies

• Data integration skills with messaging queues and SOA

• Cloud knowledge: AWS (S3, EMR, Lake Formation), Azure (HDInsight) or Google Cloud (a brief boto3 sketch follows this list)
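
As one small, hedged example of the AWS knowledge mentioned above (the region is a placeholder, and credentials are assumed to come from the environment or an IAM role, not from this posting), listing active EMR clusters with boto3 might look like:

    import boto3

    def active_emr_clusters(region="us-east-1"):  # placeholder region
        """List EMR clusters that are starting, running or waiting."""
        emr = boto3.client("emr", region_name=region)
        resp = emr.list_clusters(ClusterStates=["STARTING", "RUNNING", "WAITING"])
        return [(c["Id"], c["Status"]["State"], c["Name"]) for c in resp["Clusters"]]

    if __name__ == "__main__":
        for cluster_id, state, name in active_emr_clusters():
            print("%s\t%s\t%s" % (cluster_id, state, name))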

Sabre Corporation is the leading technology provider to the global travel industry. Sabre’s software, data, mobile and distribution solutions are used by hundreds of airlines and thousands of hotel properties to manage critical operations, including passenger and guest reservations, revenue management, flight, network and crew management. Sabre also operates a leading global travel marketplace, which processes more than US$120 billion of global travel spend annually by connecting travel buyers and suppliers. Headquartered in Southlake, Texas, USA, Sabre serves customers in more than 160 countries around the world.
