BI Warehouse Architect at DCI Donor Services in WEST SACRAMENTO, California


Type: Full-Time





Job Description:


DCI Donor Services

Sierra Donor Services (SDS) is looking for a dynamic and enthusiastic team member to join us in saving lives! Our mission at Sierra Donor Services is to save lives through organ donation, and we want professionals on our team who will embrace this important work! We are currently seeking a BI Warehouse Architect. The BI Warehouse Architect is responsible for designing, deploying, and maintaining advanced data solutions, including data warehouse and data management processes, to support the business intelligence team with reporting and advanced analytics requirements. This position will be onsite in Sacramento, CA.

COMPANY OVERVIEW AND MISSION

For over four decades, DCI Donor Services has been a leader in working to end the transplant waiting list. Our unique approach to service allows for nationwide donation, transplantation, and distribution of organs and tissues while maintaining close ties to our local communities.

DCI Donor Services operates three organ procurement/tissue recovery organizations: New Mexico Donor Services, Sierra Donor Services, and Tennessee Donor Services. We also maximize the gift of life through the DCI Donor Services Tissue Bank and Tennessee Donor Services Eye Bank.

Our performance is measured by the way we serve donor families and recipients. To be successful in this endeavor is our ultimate mission. By mobilizing the power of people and the potential of technology, we are honored to extend the reach of each donor’s gift and share the importance of the gift of life.

We are committed to diversity, equity, and inclusion. With the help of our employee-led strategy team, we will ensure that all communities feel welcome and safe with us because we are a model for fairness, belonging, and forward thinking.

Key responsibilities this position will perform include:

  1. Adhere to data engineering principles and practices.
  2. Design, build, deploy, automate, and maintain end-to-end data pipelines for new and existing data sources and targets utilizing modern ETL/ELT tools and practices, including stream processing technologies where appropriate.
  3. Demonstrate problem-solving ability that enables the team to resolve issues in a timely and effective manner.
  4. Drive and complete project deliverables within the data engineering & management area according to project plans.
  5. Utilize in-depth technical expertise regarding data models, master data management, metadata management, reference data management, and data warehousing.
  6. Work with internal technical resources to optimize the data warehouse through hardware or software upgrades or enhancements.
  7. Design and implement data models that balance performance, flexibility, and ease of use, considering both analytical and operational needs.
  8. Enable and support self-service analytics by designing intuitive data models and views, collaborating with the Business Intelligence team to ensure data is easily accessible and interpretable for business partners.
  9. Manage and automate the deployment of upgrades, patches, and new features across the data infrastructure, ensuring minimal disruption to data services and maintaining system integrity.
  10. Ensure compliance across all data warehouse administration activities.
  11. Manage and collect business metadata and data integration points.
  12. Coordinate with business intelligence to prepare technical design specifications to address user needs.
  13. Develop and implement comprehensive testing strategies, including automated unit, integration, and end-to-end tests, to ensure the accuracy, reliability, and performance of data pipelines and procedures. Provide technical support and coordination during warehouse design, testing, and movement to production.
  14. Create and maintain thorough, up-to-date documentation for all data engineering projects, processes, and systems, adhering to organizational standards and leveraging modern documentation tools and practices. Implement business rules via coding, stored procedures, middleware, or other technologies ensuring scalability and maintainability of implemented solutions.
  15. Perform other related duties as assigned.

The ideal candidate will have:

TECHNICAL SKILLS:

  1. Programming Languages:
    1. Advanced proficiency in SQL
    2. Familiarity with Python preferred
  2. Cloud Platforms:
    1. Strong experience with at least one major cloud platform (AWS, Azure, or GCP)
    2. Understanding of cloud-native architectures and services
  3. Data Warehousing and Lakes:
    1. Experience with modern data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery)
    2. Familiarity with data warehouse architectures and technologies
  4. ETL/ELT and Data Pipelines:
    1. Proficiency in designing and implementing scalable data pipelines
    2. Experience with ETL/ELT tools
  5. Database Systems:
    1. Strong knowledge of both relational (e.g., PostgreSQL, Oracle) and NoSQL (e.g., MongoDB, Cassandra) databases
    2. Experience with database optimization and performance tuning
  6. Data Modeling:
    1. Proficiency in dimensional modeling and data warehouse design
    2. Experience with data modeling tools
  7. Data Governance and Security:
    1. Understanding of data governance principles and practices
    2. Knowledge of data security and privacy best practices
  8. Machine Learning Operations (MLOps):
    1. Familiarity with MLOps practices and tools
  9. Data Visualization:
    1. Basic proficiency with data visualization tools (e.g., Power BI, Tableau)

PHYSICAL TRAITS: Reads, writes, listens and observes. Communicates using both verbal and technological avenues. Walks, stands, lifts, and carries light loads.

QUALIFICATIONS:

Education Required:

Bachelor's degree in Computer Science, Data Science, Engineering, or a related technical field. Master's degree is preferred but not required. Equivalent combination of education and experience may be considered.

Experience:

Minimum of 7 years of professional experience in data engineering, with at least 3 years in a senior or lead role.

Must have ten (10) years designing and implementing large-scale data pipelines and ETL processes

Must have minimum of three (3) years working with cloud-based platforms (e.g., AWS, Azure, GCP)

Must have ten (10) years implementing and maintaining data lakes and/or warehouses

Must have five (5) years using modern big data technologies such as Spark, Hadoop, etc.

Must have experience in applying data governance and security practices.

Must have experience documenting and keeping updated documentation on practices/processes for warehousing.

LICENSES/CERTIFICATION: Certifications in the following areas are preferred but not required:

  • Data Engineering Certifications:
    • Google Certified Professional Data Engineer
    • AWS Certified Data Analytics – Specialty
    • Azure Data Engineer Associate
    • Cloudera Certified Professional (CCP) Data Engineer
  • Modern Data Warehouse Architecture Certifications:
    • Databricks Certified Professional Data Engineer
    • SnowPro Advanced: Data Engineer Certification
    • Microsoft Certified: Azure Data Engineer Associate
    • Google Cloud Certified - Professional Data Engineer
    • AWS Certified Data Analytics - Specialty
  • Advanced SQL Developer Certifications:
    • Microsoft Certified: Azure Database Administrator Associate
    • Oracle PL/SQL Developer Certified Associate
    • Databricks Certified Associate Developer for Apache Spark
    • Google Cloud Certified - Professional Cloud Database Engineer
    • AWS Certified Database – Specialty
  • Advanced Programming and ETL Development Certifications:
    • Python Institute PCPP – Certified Professional in Python Programming
    • RStudio Certified Professional Data Scientist
    • Informatica Certified Professional or Talend Certified Developer
    • AWS Certified Developer - Associate
    • Microsoft Certified: Azure Data Engineer Associate
  • Data Governance and Security Certifications:
    • ISACA Certified Information Systems Auditor (CISA)
    • Certified Information Privacy Technologist (CIPT)
    • Certified Information Systems Security Professional (CISSP)


We offer a competitive compensation package including:

  • Up to 176 hours of PTO your first year
  • Up to 72 hours of Sick Time your first year
  • Two Medical Plans (your choice of a PPO or HDHP), Dental, and Vision Coverage
  • 403(b) plan with matching contribution
  • Company provided term life, AD&D, and long-term disability insurance
  • Wellness Program
  • Supplemental insurance benefits such as accident coverage and short-term disability
  • Discounts on home/auto/renter/pet insurance
  • Cell phone discounts through Verizon
  • Monthly phone stipend

**New employees must have their first dose of the COVID-19 vaccine by their potential start date or be able to supply proof of vaccination.**

You will receive a confirmation e-mail upon successful submission of your application. The next step of the selection process is a video screening; instructions to complete it will be contained in the confirmation e-mail. Please note: you must complete the video screening within 5 days of submitting your application to be considered for the position.

DCIDS is an EOE/AA employer – M/F/Vet/Disability.


Salary:

$96,000.00
