Title: Staff Software Engineer (Data)
Bangalore, Karnataka, IN
Job Summary
Are you passionate about building the future of hybrid cloud data management? NetApp is developing a portfolio of data-centric platforms and services to help organizations unlock the true power of their data. The Keystone team is at the forefront of this transformation, delivering innovative, subscription-based, pay-as-you-go solutions that give customers a seamless data management experience, whether on-premises or in the cloud. We are building the engine that powers this flexible consumption model, and we're looking for a technical visionary to help us solve the next generation of challenges in distributed systems, cloud services, and data analytics.
As a Staff Software Engineer (Data) in the Keystone organization, you will be a technical leader and a force multiplier for our data engineering teams. You will go beyond leading projects; you will set the technical vision for critical components of our platform. You will be responsible for designing elegant, scalable, and resilient solutions to our most complex architectural challenges. This role requires a blend of deep technical expertise, a strategic mindset, and the ability to influence and mentor engineers across the organization. If you thrive on solving ambiguous problems, driving technical excellence, and building systems that operate at massive scale, this is the role for you.
Job Requirements
- 10+ years of professional experience in data engineering or software engineering, with a proven track record of delivering high-scale distributed systems.
- Experience acting as a lead or key technical influencer, specifically guiding the architecture of large-volume data processing platforms.
- Demonstrated expertise in designing for high availability and fault tolerance in systems handling financial or mission-critical usage data.
- Deep, hands-on proficiency in Go and/or Python for building data-intensive applications.
- Proven experience with stream processing technologies (e.g., Kafka, Flink, Spark Streaming) for real-time event ingestion.
- Proven experience with relational databases (e.g., PostgreSQL) and NoSQL stores (e.g., Cassandra, DynamoDB), with a deep understanding of optimizing for time-series and event data.
- Deep knowledge of modern data warehousing concepts (e.g., Snowflake, Redshift) and advanced data modeling for complex analytics.
- Hands-on experience with containerization and orchestration technologies, specifically Docker and Kubernetes, within a cloud-native environment (AWS, Azure, or GCP).
- Experience building and maintaining automated CI/CD pipelines and data orchestration tools (e.g., Airflow, GitLab CI, Jenkins).
- Familiarity with full-stack architecture and an understanding of how data layers interface with modern frontend frameworks like React.js.
- Strong understanding of data integrity, contracts, and reconciliation patterns.
Education
- Typically requires a minimum of 10-15 years of related experience with a Bachelor’s degree or 8 years and a Master’s degree; or a PhD with 3 years experience; or equivalent experience.
- A Bachelor of Science degree in Computer Science, Engineering, or a related field; or equivalent, relevant experience.
Job Segment:
Software Engineer, Cloud, Data Modeler, Data Warehouse, Developer, Engineering, Technology, Data