Senior Data Engineer

Location

Thiruvananthapuram, Kerala, India

Salary

₹60,000 - ₹100,000 a year

Description

Join SADA India as a Senior Data Engineer on the Enterprise Support service team!

Your Mission 

As a Sr. Data Engineer on the Enterprise Support service team at SADA, you will reduce customer anxiety about running production workloads in the cloud by implementing and iteratively improving observability and reliability. You will have the opportunity to engage with our customers in a meaningful way by defining, measuring, and improving key business metrics; eliminating toil through automation; inspecting code, design, implementation, and operational procedures; enabling experimentation by helping create a culture of ownership; and winning customer trust through education, skill sharing, and implementing recommendations.

Your efforts will accelerate our customers’ cloud adoption journey, and we will be with them through the transformation of their applications, infrastructure, and internal processes. You will be part of a new social contract between customers and service providers that demands shared responsibility and accountability: our partnership with our customers will ensure we are working towards a common goal and share a common fate.

This is primarily a customer-facing role. You will also work closely with SADA’s Customer Experience team to execute their recommendations to our customers, and with Professional Services on large projects that require PMO support.

Pathway to Success 

#MakeThemRave is at the foundation of all our engineering. Our motivation is to provide customers with an exceptional experience in migrating, developing, modernizing, and operationalizing their systems on Google Cloud Platform.

Your success starts by positively impacting the direction of a fast-growing practice with vision and passion. You will be measured bi-yearly by the breadth, magnitude, and quality of your contributions, your ability to estimate accurately, customer feedback at the close of projects, how well you collaborate with your peers, and the consultative polish you bring to customer interactions.

As you continue to execute successfully, we will build a customized development plan together that leads you through the engineering or management growth tracks.

Expectations

Customer Facing - You will interact with customers on a regular basis, sometimes daily, other times weekly/bi-weekly. Common touchpoints occur when qualifying potential opportunities, at project kickoff, throughout the engagement as progress is communicated, and at project close. You can expect to interact with a range of customer stakeholders, including engineers, technical project managers, and executives.

Onboarding/Training - The first several weeks are dedicated to learning and will include training materials and assignments, compliance training, and meetings with relevant individuals.

Job Requirements

Required Credentials:

  • Google Professional Data Engineer certification, or the ability to obtain it within the first 45 days of employment
  • A secondary Google Cloud certification in any other specialization

Required Qualifications: 

  • 4+ years of experience in cloud support
  • Experience supporting customers, preferably in 24/7 environments
  • Experience working with Google Cloud data products (Cloud SQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc.)
  • Experience writing software in two or more languages such as Python, Java, Scala, or Go
  • Experience in building production-grade data solutions (relational and NoSQL)
  • Experience with systems monitoring/alerting, capacity planning, and performance tuning
  • Experience with BI tools such as Tableau and Looker is an advantage
  • A consultative mindset: delight customers by building rapport, fully understanding their requirements, and providing accurate solutions

Useful Qualifications:

    • Mastery in at least one of the following domain areas:
      • Google Cloud Dataflow: building batch/streaming ETL pipelines with frameworks such as Apache Beam or Google Cloud Dataflow and working with messaging systems like Pub/Sub, Kafka, and RabbitMQ; autoscaling Dataflow clusters and troubleshooting cluster operation issues
      • Data Integration Tools: building data pipelines using modern data integration tools such as Fivetran, Striim, Data Fusion, etc., with hands-on experience configuring and integrating multiple data sources within and outside of Google Cloud
      • Large Enterprise Migration: migrating entire cloud or on-premises estates to Google Cloud, including data lakes, data warehouses, databases, business intelligence, jobs, etc., and providing consultation on cost optimization, migration methodology, and execution planning
  • Experience with IoT architectures and building real-time data streaming pipelines
  • Experience operationalizing machine learning models on large datasets
  • Demonstrated leadership and self-direction, including a willingness to teach others and learn new techniques
  • Demonstrated skills in selecting the right statistical tools given a data analysis problem
  • Understanding of Chaos Engineering
  • Understanding of PCI, SOC2, and HIPAA compliance standards
  • Understanding of the principle of least privilege and security best practices
  • Understanding of cryptocurrency and blockchain technology

Job type:

Remote job

Tags

  • cryptocurrency
  • security
  • teach
  • technical
  • support
  • software
  • growth
  • cloud
  • management
  • senior
  • operational
  • engineer
  • engineering
  • apache