Careers at Azimuth

Databricks Lead

Location:

Jacksonville, FL

About Azimuth:

Azimuth is revolutionizing the world of regulatory compliance with automated compliance management technology. Our innovative solution helps companies comply with federal and state laws and ensures that every customer receives a fair and equitable experience. We are an Equal Opportunity Employer and foster an inclusive, supportive, and diverse work culture. We proudly reward employees with equity compensation when the company succeeds.

We are seeking standout individuals to join our well-funded, growing startup. Qualified candidates must possess an insatiable desire to innovate, execute, and follow through. The right candidates detest manual processes, outdated spreadsheets, mediocrity, and moving at a glacial pace.

If you are interested in joining a team that is transforming the culture of compliance and driving equity, Azimuth is for you. If this role suits your career aspirations, please apply to this position, and our HR team will get in touch with you soon.

Key Responsibilities:

  • Conceptualizing and communicating data architecture strategy, technical design, and the technology roadmap for data platform solutions and services.
  • Leading the design and development of data architecture for large-scale data solutions in Databricks, supporting multiple use cases (delta lake, cloud data warehouse migrations, reporting, and analytics).
  • Guiding the organization and development teams on overall data processes and architectural approaches for data strategy design and implementation.
  • Providing data solutions that enable business intelligence, data integrations, data services, self-service analytics, and data-driven digital products and services.
  • Articulating the value proposition of cloud modernization/transformation to stakeholders; creating detailed documentation that empowers other engineers and customers.
  • Architecting solutions for Databricks-specific use cases (streaming and batch) and integrating them with other solution components in the customer architecture landscape.
  • Translating advanced business data, integration, and analytics problems into technical approaches that yield actionable recommendations across multiple, diverse domains.
  • Implementing architectural improvements for existing solutions using legacy modernization and cloud adoption strategies.
  • Creating Azure Data Factory processes for SQL and NoSQL data pipelines, and working in Azure SQL to create tables, views, stored procedures, and other database objects.
  • Creating applications in Scala, Python, and Java.
  • Developing Spark tasks for data aggregation and transformation.
  • Writing Scala-style documentation to accompany each piece of code.
  • Identifying technical issues (performance bottlenecks, data discrepancies, system defects) in data and software, performing root cause analysis, and communicating results effectively to the development team and other stakeholders.
  • Providing recommendations on opportunities and improvements, and participating in technology and analytics proofs of concept.

Qualification and Experience:

  • Bachelor’s Degree in Computer Science, Data, Information Systems, or related field
  • 8 years of experience with Azure Databricks, Azure Data Factory, and PySpark
  • 5 years’ experience working in Microsoft Azure
  • 3 years’ experience with Python to extract data from unstructured files such as XML, PDF, and HTML
  • Highly proficient in the use of Spark and Azure Databricks
  • Experience with deployment and builds using Azure DevOps
  • Experience in an Agile development environment, or an understanding of Agile software development concepts
  • Excellent communication, organization, technical, and project management skills
  • Experience leading projects related to cloud modernization, data migration, and data warehousing on cloud-based data platforms (Databricks/Apache Spark); Databricks certification preferred; experience driving technical workshops with technical and business clients to derive value-added services and implementations
  • Hands-on working knowledge of topics such as data security, messaging patterns, ELT, data wrangling, and cloud computing, and proficiency in data integration/EAI and database technologies, sophisticated analytics tools, programming languages, or visualization platforms
  • Knowledge of the Databricks ecosystem

Apply Now: