We require an architect to support several internal initiatives:
We are consolidating several operational dashboard and insights applications from stand-alone systems onto a centralized data platform. The architect will help design the new systems and support the teams through the transition.
We are moving legacy data systems from private data centers to the cloud. As we migrate these systems, we want to modernize their deployment, packaging, and testing processes, including creating CI/CD pipelines for these tools.
Role and Responsibilities
Develop new standards around CI/CD pipelines for data producers.
Develop standards around monitoring, alerting, logging, and issue tracking.
Required Skills and Experience
5+ years of experience with data warehouses and data lakes.
5+ years of experience with reporting technology.
5+ years of experience with Azure infrastructure, integration patterns, application communication, and security models.
3+ years of experience designing stream processing systems (e.g., Databricks, Spark, Kafka).
5+ years of experience with CI/CD pipelines for data systems.
5+ years of experience with ETL systems.
3+ years of experience with operational monitoring, logging, and alerting systems.
3+ years of experience working in a DevOps environment.
Hands-on; able to dive into existing code and improve it.
Strong communication skills, including presentation and writing skills.
Experience mentoring engineering teams to enable them to deliver on architectural strategy.
Technologies
Azure Networking, Azure Data Factory (ADF), Azure Databricks, Apache Kafka, Microsoft Power BI, Azure Key Vault (AKV), Azure Data Lake Storage Gen2 (ADLS), Azure File Share (AFS), Azure Active Directory (AAD), Single Sign-On using Azure AD, Java, Python