DataOps Lead

Sigmoid
Bengaluru, Karnataka, India · Apr 1, 2026 · Posted 11 days ago
Tech Stack

AWS · GCP · Jenkins · Docker · Kubernetes · Hadoop · HDFS · MapReduce · Oozie · Hive · Impala · Spark · Kafka · Kerberos

Must-Have Requirements

  • 8-12 years of relevant work experience
  • Degree in Computer Science or a related technical discipline
  • Experience in Shell, Python, or any scripting language
  • Experience managing Linux systems and build/release tools like Jenkins
  • Effective communication skills (both written and verbal)
  • Ability to collaborate with a diverse set of engineers, data scientists, and product managers
  • Comfort in a fast-paced start-up environment

Nice to Have

  • Support experience in the Big Data domain
  • Architecting, implementing, and maintaining Big Data solutions
  • Experience with the Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka)
  • Experience with container technologies like Docker and Kubernetes, and configuration management systems
  • Proven track record of building and shipping large-scale engineering products
  • Knowledge of cloud infrastructure such as GCP/AWS

Description

As a DataOps Lead, you will be responsible for managing and designing highly scalable and available solutions for data pipelines that provide the foundation for collecting, storing, modeling, and analyzing massive data sets from multiple channels. This position reports to the DevOps Architect.

Responsibilities

  • Align Sigmoid with key client initiatives
  • Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
  • Connect with VP- and Director-level clients on a regular basis
  • Travel to client locations
  • Understand business requirements and tie them to technology solutions
  • Strategically support technical initiatives
  • Design, manage, and deploy highly scalable and fault-tolerant distributed components using Big Data technologies
  • Evaluate and choose technology stacks that best fit client data strategy and constraints
  • Drive automation and massive deployments
  • Drive good engineering practices from the bottom up
  • Develop industry-leading CI/CD, monitoring, and support practices inside the team
  • Develop scripts to automate DevOps processes and reduce team effort
  • Work with the team to develop automation and resolve issues
  • Support TB-scale pipelines
  • Perform root cause analysis for production errors
  • Support developers in day-to-day DevOps operations
  • Apply excellent experience in application support, integration development, and data
  • Design the roster and escalation matrix for the team
  • Provide technical leadership and manage the team on a day-to-day basis
  • Guide DevOps engineers in day-to-day design, automation, and support tasks
  • Play a key role in hiring technical talent to build the future of Sigmoid
  • Conduct training on the technology stack for developers in-house and outside

Culture

  • Must be a strategic thinker with the ability to think unconventionally / out of the box
  • Analytical and data-driven
  • Raw intellect, talent, and energy are critical
  • Entrepreneurial and agile: understands the demands of a private, high-growth company
  • Ability to be both a leader and a hands-on "doer"
Qualifications

  • 8-12 years of relevant work experience and a Computer Science or related technical degree required
  • Proven track record of building and shipping large-scale engineering products and/or knowledge of cloud infrastructure such as GCP/AWS preferred
  • Experience in Shell, Python, or any scripting language
  • Experience managing Linux systems and build/release tools like Jenkins
  • Effective communication skills (both written and verbal)
  • Ability to collaborate with a diverse set of engineers, data scientists, and product managers
  • Comfort in a fast-paced start-up environment

Preferred Qualifications

  • Support experience in the Big Data domain
  • Architecting, implementing, and maintaining Big Data solutions
  • Experience with the Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka, etc.)
  • Experience with container technologies like Docker and Kubernetes, and configuration management systems