| Position code | DE0001 |
| Expected remuneration | from €6,000 |
| Location | Bratislava |
| Utilization | Full-time |
| Project availability | Hybrid |
| Project start | ASAP |
| Allocation length | Long-term |
We are seeking experienced Data Engineers to support our data platform initiatives on
Microsoft Azure using Databricks. Ideal candidates will have a strong background in building
scalable data pipelines, implementing medallion architecture, and migrating legacy data
platforms to modern, cloud-based solutions.
Responsibilities
• Design, develop, and maintain scalable data pipelines using Databricks on Azure
• Implement and optimize medallion architecture (bronze, silver, gold layers)
• Migrate data marts and data warehouses from on-premises platforms such as Oracle
and Hadoop to Azure Databricks
• Perform ETL/ELT processes for data extraction, transformation, and loading from
legacy systems
• Ensure data quality, governance, and security across the platform
• Collaborate with data architects, analysts, and business stakeholders to understand
data requirements
• Support CI/CD practices and automation for data workflows
• Troubleshoot and resolve data-related issues in both production and development
environments
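To illustrate the medallion architecture mentioned above, here is a minimal conceptual sketch of the bronze/silver/gold layering in plain Python. A real Databricks pipeline would use Spark DataFrames and Delta Lake tables rather than in-memory lists; all field and table names here are hypothetical.

```python
# Conceptual sketch of medallion-architecture layers using plain Python
# structures; a real Databricks pipeline would use Spark DataFrames and
# Delta Lake tables instead. All field names are hypothetical examples.

def to_bronze(raw_records):
    """Bronze: land raw data as-is, adding only ingestion metadata."""
    return [dict(r, _ingested=True) for r in raw_records]

def to_silver(bronze_records):
    """Silver: cleanse and standardize (drop rows missing a key, cast types, trim text)."""
    return [
        {"id": r["id"], "amount": float(r["amount"]), "region": r["region"].strip().upper()}
        for r in bronze_records
        if r.get("id") is not None
    ]

def to_gold(silver_records):
    """Gold: aggregate into a business-level view (e.g., revenue per region)."""
    totals = {}
    for r in silver_records:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

raw = [
    {"id": 1, "amount": "100.5", "region": " eu "},
    {"id": 2, "amount": "50.0", "region": "us"},
    {"id": None, "amount": "9.9", "region": "eu"},  # rejected at the silver layer
]
gold = to_gold(to_silver(to_bronze(raw)))
print(gold)  # {'EU': 100.5, 'US': 50.0}
```

The key idea each layer demonstrates is the same one used in Delta Lake pipelines: raw data is preserved unchanged in bronze, quality rules are applied once in silver, and gold holds only consumption-ready aggregates.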
Base Requirements
• Minimum 5 years of experience in data engineering roles
• Hands-on experience with Databricks on Microsoft Azure
• Strong proficiency in SQL and Python for data manipulation and pipeline
development
• Proven experience designing and implementing medallion architecture
• Experience with Delta Lake, Apache Spark, and Databricks notebooks
• Deep understanding of Azure Data Lake, Azure Data Factory, and Azure DevOps
• Experience migrating data from Oracle, Hadoop, or other legacy platforms to cloud
environments
• Familiarity with data modeling, schema conversion, and data validation during
migrations
• Knowledge of data governance, security, and compliance in cloud environments
• Experience with unit testing, version control (Git), and CI/CD pipelines
• Excellent communication skills in English (written and spoken)
Preferred Requirements
• Relevant certifications, such as Microsoft Certified: Azure Data Engineer Associate or Databricks certifications, or experience with Oracle/Hadoop migration tooling
• Experience with real-time data processing (e.g., Kafka, Structured Streaming)
• Familiarity with MLflow, Unity Catalog, and Databricks Workflows
• Experience in data warehouse modernization and cloud-native architectures
If you are interested in this position, fill in the form below or send your CV to info@biscom.sk.