CapB is looking for an experienced Azure Data and DevOps Engineer. This is a 100% remote position. The job description is below.
PRIMARY DUTIES
1. Create and maintain optimal data pipeline architecture.
2. Assemble large, complex data sets that meet functional and non-functional business requirements.
3. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
4. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Big Data technologies.
5. Deliver automation and lean processes to ensure high-quality throughput and performance of the entire data and analytics platform.
6. Work with data and analytics experts to strive for greater functionality in our analytics platforms.
POSITION SPECIFICATIONS
• Experience with High Availability / Disaster Recovery (HADR) in Azure
• Experience building, operating, and maintaining fault-tolerant and scalable data processing integrations using Azure
• Experience using Azure Data Factory or Synapse Analytics
• Experience using Databricks and Apache Spark
• Strong problem-solving skills with an emphasis on optimizing data pipelines
• Excellent written and verbal communication skills for coordinating across teams
• A drive to learn and master new technologies and techniques
• Experience in DevOps and Agile environments and with CI/CD pipelines
• Experience using Docker or Kubernetes is a plus
• Demonstrated capabilities with cloud infrastructures and multi-cloud environments such as Azure, AWS, and IBM Cloud
• Experience architecting transactional data platforms
• Experience architecting real-time/event-streaming data platforms (IoT)
Apply for this Job
Please use the APPLY HERE link below to view additional details and application instructions.