
Job Details

Technical Specialist - Junior | Raleigh, NC (Hybrid)

  2026-02-05     My3Tech     Raleigh, NC
Description:

Job: Databricks Administrator/Architect

Location: Raleigh, NC

  • ONLY SUBMIT CANDIDATES CURRENTLY LIVING IN THE RALEIGH/DURHAM/CHAPEL HILL, NC AREA.
  • The candidate must come onsite on the first day to collect equipment.
  • All candidates must be local to the Triangle region of North Carolina; the position may require up to 1-2 days per month in a Triangle-area office for meetings.
The North Carolina Department of Transportation Database Team seeks a Databricks Administrator/Architect with proven skills for a 12-month engagement covering the creation, tuning, and support of the Databricks environment. This position will be responsible for designing and developing the Databricks environment at NCDIT-T. This individual will work with internal staff to plan, design, and maintain the Databricks environment and recommend the changes needed to accommodate growth as business needs dictate. This individual will facilitate changes through DIT-T's change process and work closely with the DBA and development staff on all aspects of the design and planning of the Databricks environment.

Responsibilities:
  • Provide mentorship, guidance, overall knowledge share, and support to team members, promoting continuous learning and development.
  • Oversee the design, implementation, and maintenance of Databricks clusters.
  • Ensure the platform's scalability, performance, and security.
  • Provide escalated support and troubleshooting to users.
  • Oversee maintenance of role-based access to data and features in the Databricks Platform using Unity Catalog.
  • Review cluster health checks and best-practice implementation.
  • Review and maintain documentation for users and administrators.
  • Design and implement tailored data solutions to meet customer needs and use cases, ranging from ingesting data from APIs to building data pipelines and analytics, within a dynamically evolving technical stack.
  • Work on projects involving on-premises data ingestion into Azure using Azure Data Factory (ADF).
  • Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources.


SKILL MATRIX:
  1. Extensive hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog - Required 5 Years
  2. Strong understanding of Relational & Dimensional modeling - Required 5 Years
  3. Demonstrated proficiency in coding skills - Python, SQL, and PySpark - to efficiently deliver performance, security, scalability, and robust data integrations - Required 6 Years
  4. Experience implementing serverless real-time/near-real-time architectures in the cloud (e.g., Azure, AWS, or GCP tech stacks) and Spark technologies (Streaming & ML) - Required 2 Years
  5. Experience with Azure infrastructure configuration (networking), architecting and building large data ingestion pipelines, and conducting data migrations using ADF or similar technology - Required 4 Years
  6. Experience working with SQL Server features such as SSIS and CDC - Required 7 Years
  7. Experience with Databricks platform, security features, Unity Catalog, and data access control mechanisms - Required 2 Years
  8. Experience with GIT code versioning software - Required 4 Years
  9. Databricks Certifications


Apply for this Job

Please use the APPLY HERE link below to view additional details and application instructions.

Apply Here
