Databricks Lead
Azimuth is on a mission to reinvent regulatory compliance through technology.
Location: Remote
About Azimuth:
Lead the charge in compliance tech innovation. We're looking for a Databricks Lead who's all about elevating the game for Azimuth, working with our hub in Jacksonville, FL.
What’s in it for you?
- A chance to architect the future of compliance technology.
- The opportunity to turn cutting-edge ideas into market-leading realities.
- A role where you drive major tech decisions, shape strategies, and lead a powerhouse team.
You are:
- A Databricks Lead with a solid track record of implementing business rule-based solutions.
- A professional who thrives on analyzing data and creating functional specifications.
- Excited to deliver solutions that reshape how industries approach compliance.
Why Azimuth? Because here, your vision impacts not just Azimuth but an entire industry. We’re not just building systems; we’re setting standards.
Job Requirements:
- Conceptualizing and communicating the data architecture strategy, technical design, and technology roadmap for data platform solutions and services.
- Leading the design and development of data architecture for large-scale data solutions in Databricks across multiple use cases (Delta Lake, cloud data warehouse migrations, reporting, and analytics).
- Guiding the organization and development teams on overall data processes and architectural approaches for data strategy design and implementation.
- Providing data solutions that enable business intelligence data integrations, data services, self-service analytics, and data-driven digital products and services.
- Articulating the value proposition of cloud modernization/transformation to stakeholders; creating detailed documentation that empowers other engineers and customers.
- Architecting solutions for Databricks-specific use cases (streaming and batch) and integrating them with other solution components in the customer architecture landscape.
- Translating advanced business data, integration, and analytics problems into technical approaches that yield actionable recommendations across multiple, diverse domains.
- Implementing architectural improvements for existing solutions using legacy modernization and cloud adoption strategies.
- Creating Azure Data Factory processes for SQL and NoSQL data pipelines, and working in Azure SQL to create tables, views, stored procedures, and other database objects.
- Creating applications in Scala, Python, and Java.
- Developing Spark jobs for data aggregation and transformation (see the sketch after this list).
- Writing Scaladoc-style documentation alongside each code deliverable.
- Identifying technical issues (performance bottlenecks, data discrepancies, system defects) in data and software, performing root cause analysis, and communicating results effectively to the development team and other stakeholders.
- Providing recommendations on opportunities and improvements, and participating in technology and analytics proofs of concept.
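For illustration only, here is a minimal PySpark sketch of the kind of aggregation and Delta Lake work described above; the table names and columns are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("filing-aggregation").getOrCreate()

    # Read raw filing events from a (hypothetical) Delta table.
    events = spark.read.table("raw.filing_events")

    # Aggregate: count filings and find the latest filing date per entity.
    summary = (
        events
        .withColumn("filing_date", F.to_date("filed_at"))
        .groupBy("entity_id")
        .agg(
            F.count("*").alias("filing_count"),
            F.max("filing_date").alias("last_filed"),
        )
    )

    # Write the result back to the lakehouse as a Delta table.
    summary.write.format("delta").mode("overwrite").saveAsTable("curated.entity_filing_summary")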
Qualifications and Experience:
- Bachelor’s Degree in Computer Science, Data, Information Systems, or related field
- 8 years of experience with Azure Databricks, Azure Data Factory, and PySpark
- 5 years’ experience working in Microsoft Azure
- 3 years’ experience using Python to extract data from unstructured files such as XML, PDF, and HTML (see the sketch after this list)
- Highly proficient in use of Spark and Azure Databricks
- Experience with deployment and builds using Azure DevOps
- Experience in an agile development environment, or an understanding of the concepts of Agile software development
- Excellent communication, organization, technical, and project management skills
- Experience leading cloud modernization, data migration, and data warehousing projects on cloud-based data platforms (Databricks/Apache Spark); Databricks certification preferred; experience driving technical workshops with technical and business clients to derive value-added services and implementations
- Hands-on working knowledge of topics such as data security, messaging patterns, ELT, data wrangling, and cloud computing, plus proficiency in data integration/EAI and database technologies, advanced analytics tools, programming languages, and visualization platforms
- Knowledge of the Databricks ecosystem
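For illustration only, here is a minimal Python sketch of the kind of unstructured-file extraction mentioned above, using only the standard library's xml.etree and html.parser modules; the file structure and field names are hypothetical. (PDF extraction typically relies on a third-party library such as pypdf.)

    import xml.etree.ElementTree as ET
    from html.parser import HTMLParser

    # Pull filing records out of a (hypothetical) XML regulatory feed.
    def parse_filings(xml_path):
        tree = ET.parse(xml_path)
        return [
            {"entity_id": f.findtext("entityId"), "status": f.findtext("status")}
            for f in tree.getroot().iter("filing")
        ]

    # Collect the visible text from an HTML document.
    class TextExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.chunks = []

        def handle_data(self, data):
            if data.strip():
                self.chunks.append(data.strip())

    def extract_html_text(html_string):
        extractor = TextExtractor()
        extractor.feed(html_string)
        return " ".join(extractor.chunks)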