MSC Data Engineer in Auckland. Provoke Solutions is seeking a talented and motivated Data Engineer to join our Auckland City team. If you are passionate about building scalable data pipelines, designing robust data lakes, and leveraging cutting-edge technologies like MS Fabric to solve complex business problems, this role offers the perfect opportunity to accelerate your career in a dynamic, AI-driven consulting environment.
About the Company
Provoke Solutions is a global consulting firm dedicated to creating AI-native solutions that transform how work gets done. With a culture rooted in innovation, curiosity, and growth, we partner with clients worldwide to embed agentic AI directly into workflows. This enables teams to move faster, think smarter, and scale with purpose. Our focus on diversity, collaboration, and continuous learning ensures a high-performing environment where talented individuals can thrive and make a tangible impact.
Responsibilities
- Design, implement, and manage data lake solutions using MS Fabric.
- Develop and maintain ETL pipelines to extract, transform, and load structured and unstructured data from multiple sources.
- Collaborate with cross-functional teams to understand data requirements and integrate new data sources into the data lake.
- Optimize data storage and retrieval processes to improve performance and scalability.
- Ensure data integrity, security, and availability by applying best practices in data governance.
- Perform data profiling, cleansing, and transformation to maintain high data quality standards.
- Monitor and troubleshoot data flows to ensure reliable pipeline operation.
- Stay current with emerging trends and technologies in data engineering, particularly around MS Fabric and data lake architecture.
Requirements
- Proven experience as a Data Engineer, with strong knowledge of data lake architecture and development.
- Hands-on experience with MS Fabric and building scalable data lakes.
- Proficiency in data integration techniques for structured and unstructured sources (APIs, databases, cloud services).
- Strong programming skills in Python, SQL, or other relevant languages.
- Experience with cloud platforms (Azure, AWS, Google Cloud) for data storage and processing.
- Solid understanding of ETL processes, data warehousing, and big data technologies.
- Strong experience with Power BI and DAX for data visualization.
- Knowledge of data governance principles.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
Skills
- Data architecture and pipeline design.
- ETL development and data integration.
- Programming: Python, SQL, or similar languages.
- Data visualization using Power BI and DAX.
- Cloud computing (Azure, AWS, Google Cloud).
- Strong analytical, troubleshooting, and problem-solving abilities.
- Effective communication and teamwork skills.
Benefits
- Work in a cutting-edge, AI-driven technology environment.
- Collaborate with a diverse, high-performing team.
- Access to professional development and learning opportunities.
- Competitive compensation with career growth potential.
- Opportunity to work on innovative projects with global impact.
