Job Summary
We are looking for an experienced Databricks Developer (Core / Data Platform) to support and enhance an enterprise-grade data platform. The ideal candidate will have strong hands-on experience with Databricks, PySpark, Delta Lake, and Lakehouse architecture, along with exposure to DataOps practices in a production environment.
This is a remote, offshore role that involves close collaboration with global/onshore teams.
Job Location:
Remote (Offshore – India)
Experience Required:
5+ Years (Data Engineering)
3+ Years (Databricks – Mandatory)
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using PySpark and Spark SQL
- Build and manage Delta Lake tables following the Bronze, Silver, and Gold (medallion) layer architecture
- Support and enhance the Lakehouse architecture and enterprise data platform standards
- Apply DataOps best practices for deployment, monitoring, and reliability
- Implement and manage Unity Catalog for data governance, access control, and lineage
- Optimize Spark jobs for performance, scalability, and cost efficiency
- Troubleshoot and support production data pipelines and workflows
- Collaborate with onshore/global data engineering, analytics, and platform teams
- Maintain clear documentation and follow enterprise development standards

