Databricks Data Engineer

Hyderabad, Telangana, India

Job Type

Full Time

Workspace

In Office

About the Role

Key Responsibilities:
  • Contribute to the development and upkeep of a scalable data platform, using tools and frameworks that leverage Azure and Databricks capabilities.
  • Work proficiently with relational databases such as MySQL and SQL Server, integrating them into applications and data pipelines.
  • Design and maintain high-quality code, including data pipelines and applications, in Python, Scala, and PHP.
  • Implement efficient data processing solutions with Apache Spark, optimizing Spark applications for large-scale workloads.
  • Optimize data storage using formats such as Parquet and Delta Lake to ensure efficient access and reliable performance.
  • Apply a working knowledge of the Hive Metastore, the Unity Catalog metastore, and the behavior of external tables.
  • Collaborate with cross-functional teams to translate business requirements into precise technical specifications.

Additional Qualifications:
  • Working knowledge of data security and privacy standards.
  • Experience with CI/CD pipelines and version control systems, notably Git.
  • Familiarity with Agile methodologies and DevOps practices.
  • Strong technical writing skills for comprehensive documentation.

Requirements

5 years of experience

  • Bachelor’s degree in Computer Science, Engineering, or a related discipline.

  • Demonstrated hands-on experience with Azure cloud services and Databricks.

  • Proficient programming skills in Python, Scala, and PHP.

  • In-depth knowledge of SQL, NoSQL databases, and data warehousing principles.

  • Familiarity with distributed data processing and external table management.

  • Insight into enterprise data solutions for PIM, CDP, MDM, and ERP applications.

  • Exceptional problem-solving acumen and meticulous attention to detail.

About the Company

We are actively seeking a self-motivated Data Engineer with expertise in Azure cloud and Databricks and a thorough understanding of Delta Lake and Lakehouse architecture. The ideal candidate excels at developing scalable data solutions, building platform tools, and integrating systems, and is proficient in cloud-native database solutions and distributed data processing.
