Enterprise Data Analytics & AI Developer/Architect

  • Full-Time
  • Remote
  • 140,000–170,000 USD / Year

Job Description:

Enterprise-scale data, analytics, and AI leadership role

Tech Partners is partnering with a client to identify an Enterprise Data Analytics and AI Developer who will lead the design and delivery of secure, scalable, and production-ready data and AI solutions across the organization. This role sits at the intersection of data engineering, analytics, AI application development, and enterprise architecture, with a strong emphasis on Microsoft Fabric and Microsoft AI Foundry.

You'll work cross-functionally with enterprise architecture, engineering, governance, and business teams to translate business outcomes into robust technical solutions. This is a hands-on, senior-level role with responsibility for strategy, solution delivery, technical leadership, and operational excellence.

The interview process includes an in-person interview in San Diego.

---

What You'll Be Doing

Enterprise Strategy & Architecture

  • Partner with enterprise architecture teams to define enterprise data and AI roadmaps.
  • Develop reference architectures and design patterns for Microsoft Fabric and AI Foundry.
  • Establish data service standards, semantic modeling conventions, and AI model lifecycle policies.
  • Influence enterprise security, privacy, and compliance for data and AI workloads.

Technical & Project Leadership

  • Lead end-to-end delivery from discovery and design through build, release, and production support.
  • Own technical quality gates, including design reviews, security reviews, and production readiness.
  • Drive non-functional requirements such as performance, scalability, cost optimization, and observability.
  • Coordinate integrations with third-party data sources and enterprise systems.

Microsoft Fabric & Analytics

  • Design and implement Lakehouse and Warehouse architectures in OneLake.
  • Build ETL pipelines using Data Factory, notebooks, shortcuts, and data mirroring.
  • Develop and optimize Power BI semantic models and datasets.
  • Implement real-time and operational analytics using KQL.
  • Harden solutions with RBAC, sensitivity labels, RLS/OLS, OAuth, SAML, and governance tooling.
  • Automate CI/CD for Fabric assets using deployment pipelines.

AI, Copilot & Microsoft AI Foundry

  • Design, configure, and deploy custom copilots using Microsoft Copilot Studio.
  • Integrate copilots into Teams and SharePoint experiences.
  • Select, evaluate, and manage models using the AI Foundry model catalog and control plane.
  • Build and operate single- and multi-agent solutions with Agent Service.
  • Implement RAG solutions using Azure AI Search and vector indices.
  • Configure observability, evaluations, guardrails, and data leakage prevention.
  • Design and train ML models to solve business problems.

DevOps, Quality & Operations

  • Establish CI/CD using GitHub for data and AI assets.
  • Implement automated testing, monitoring, logging, and runbooks.
  • Enable cost observability and capacity right-sizing.
  • Support UAT, cutover, incident response, and production operations.

---

Top Key Requirements

Experience & Background

  • 7+ years of experience in enterprise data engineering, analytics engineering, and/or AI application development.
  • Proven delivery of production-grade solutions in Azure environments.
  • Strong experience working across large, complex enterprise organizations.

Technical Expertise

  • Microsoft Fabric: OneLake, Data Factory, Lakehouse, Warehouse, KQL/Real-Time Intelligence, Power BI semantic models.
  • Microsoft AI Foundry: Model catalog, Agent Service, evaluations/observability, guardrails, and control plane.
  • Copilot Studio: Building and deploying custom copilots integrated into Teams/SharePoint.
  • RAG implementations using Azure AI Search and vector databases.
  • Strong SQL, Python, and KQL skills.
  • Deep understanding of data modeling (Kimball, EDW, streaming, lakehouse).
  • CI/CD, automated testing, IaC, monitoring, and operational readiness.
  • Data governance, privacy, and security (RBAC, sensitivity labels, RLS/OLS, DLP).

Soft Skills

  • Strong communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
  • Proven ability to lead, mentor, and influence across teams.

Additional Requirements

  • Must be able to attend an in-person interview in San Diego.
  • Valid driver's license and active personal auto insurance required.
  • Ability to pass a background check and pre-employment drug screening.

Preferred

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • Microsoft certifications (Azure Data Engineer, Azure AI Engineer, Fabric).
  • Experience with Purview, DLP, compliance frameworks, and regulated environments.
  • Experience integrating ERPs and other enterprise systems.

---

Compensation

  • $140,000–$170,000 annually, depending on experience, skills, certifications, and location.

Qualified candidates should email their resume or questions to [email protected].