We are seeking qualified DevOps engineers to join our growing Cloud Data Operations and Services team in Bangalore, India, as we continue our rapid growth with the expansion of our Indian subsidiary, phData Solutions Private Limited. This expansion comes at the right time, with increasing customer demand for data and platform solutions.
In addition to the phenomenal growth and learning opportunities, we offer a competitive compensation plan, including base salary, annual bonus, training, certifications, and equity.
As a DevOps Engineer on our Consulting Team, you will be responsible for the technical delivery of technology projects related to Snowflake, cloud platforms (AWS/Azure), and services hosted in the cloud.
Responsibilities:
- Operate and manage modern data platforms, from streaming to data lakes to analytics and beyond, across a progressively evolving technical stack.
- Learn new technologies in a quickly changing field.
- Own the execution of project tasks and field questions about related tasks other engineers are working on.
- Respond to pager incidents, solve challenging problems, and go deep into customer processes and workflows to resolve issues.
- Demonstrate clear ownership of tasks on multiple simultaneous customer accounts across a variety of technical stacks.
- Continually grow, learn, and stay up to date with the modern data technology stack.
- Work 24/7 rotational shifts.
Required Experience:
- Working knowledge of SQL and the ability to write, debug, and optimize SQL queries.
- Good understanding of writing and optimizing Python programs.
- Experience in providing operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift).
- Experience with cloud-native data technologies in AWS or Azure.
- Proven experience learning new technology stacks.
- Strong troubleshooting and performance tuning skills.
- Client-facing written and verbal communication skills and experience.
Preferred Experience:
- Production experience and certifications in core data platforms such as Snowflake, AWS, Azure, GCP, Hadoop, or Databricks.
- Production experience working with cloud and distributed data storage technologies such as S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
- Production experience working with data integration technologies such as Spark, Kafka, event/streaming platforms, StreamSets, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory, or others.
- Production experience working with workflow management and orchestration tools such as Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, or NiFi.
- Working experience with infrastructure as code using Terraform or CloudFormation.
- Expertise in a scripting language to automate repetitive tasks (Python preferred).
- Well versed in continuous integration and deployment frameworks, with hands-on experience using tools such as Bitbucket, GitHub, Flyway, and Liquibase.
- Bachelor's degree in Computer Science or a related field.