Interim Azure Data Platform Administrator / Data Engineer (gn)
Michael Page
Mechanical Engineering, Operations Technology
Essen
- Employment type: Freelance / self-employed
- Hybrid
About this job
Intro
Exciting company, exciting opportunities
Company Profile
Start: 30 April 2026 (or earlier if possible)
Project Duration: Until 24 December 2026
Workload: 16-18 hours per week (min. 2 days/week, ideally 3-4 hours/day)
Location: Remote
Industry: Engineering / Technology / Data Platforms
Project Language: English (fluent)
Responsibilities
- Design and implement scalable data storage solutions in Azure
- Develop data processing solutions across structured, unstructured, and streaming sources
- Integrate, transform, and consolidate data to deliver analytics‑ready models
- Design and operate secure, compliant, and high‑performing data pipelines
- Optimize platform stability, system efficiency, and data quality
- Implement file partitioning strategies in Azure Synapse Analytics
- Identify partitioning requirements in ADLS Gen2
- Use SQL Serverless and Spark clusters to create and run queries
- Develop and operate incremental data loads
- Transform data using Apache Spark and/or T‑SQL
- Ingest and transform data using Synapse Pipelines
- Implement duplicate‑handling and error‑handling mechanisms
- Develop batch processing solutions using ADLS Gen2
- Work with Delta Lake (read/write, incremental updates)
- Implement and operate Azure Synapse Link
- Create, schedule, and monitor data pipelines
- Integrate Python notebooks into data pipelines
- Trigger and validate batch runs; handle failures
- Monitor and optimize pipeline and query performance
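To illustrate the kind of logic the incremental-load and duplicate-handling points above refer to, here is a minimal, framework-free Python sketch of the watermark pattern (all names hypothetical; in practice this would run against Delta Lake tables via Spark or Synapse Pipelines):

```python
# Hypothetical sketch of watermark-based incremental loading and simple
# duplicate handling. No Synapse/Azure APIs are used; all names are invented.

def incremental_batch(rows, last_watermark):
    """Select rows modified after the last successful load and advance the watermark."""
    new_rows = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

def dedupe_latest(rows, key="id", ts="modified"):
    """Keep only the most recent version of each key (basic duplicate handling)."""
    latest = {}
    for r in rows:
        if r[key] not in latest or r[ts] > latest[r[key]][ts]:
            latest[r[key]] = r
    return list(latest.values())
```

The same select-merge-advance shape underlies Delta Lake `MERGE`-based upserts; this sketch only shows the control logic, not the storage layer.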
Required Qualifications
- 3+ years of hands‑on experience with Azure Synapse Workspace (pipelines, notebooks, SQL endpoints, lake database)
- Strong SQL and Python experience
- Deep practical experience with:
  - Pipelines
  - Notebooks
  - ADLS Gen2
  - SQL endpoints
  - Lake Database
- Strong English communication skills (written & spoken)
- Experience in designing, building, and operating modern Azure data platforms
- Ability to independently diagnose and optimize data pipelines and storage strategies
Nice‑to‑Have Skills
- Airflow
- Spark Delta Table libraries
- Certifications such as DP‑700 (Microsoft Fabric Data Engineering)
Compensation Package
Does the project sound interesting?
I look forward to your response with the following information:
- Your earliest availability
- Your maximum weekly capacity
- Confirmation that you can deliver this workload on a regular basis
- Your hourly rate (remote)
- Your current profile (PDF)
- A brief comment on your suitability (referencing the tasks & requirements listed above)