Artificial Intelligence Engineer at Fractal in San Antonio, Texas

Type: full-time

Job Description:

Fractal is a strategic AI partner to Fortune 500 companies with a vision to power every human decision in the enterprise. Fractal is building a world where individual choices, freedom, and diversity are the greatest assets: an ecosystem where human imagination is at the heart of every decision, where no possibility is written off, only challenged to get better. We believe that a true Fractalite is one who empowers imagination with intelligence. Fractal has been featured as a Great Place to Work by The Economic Times in partnership with the Great Place to Work® Institute, and has been recognized as a 'Cool Vendor' and a 'Vendor to Watch' by Gartner.

Please visit Fractal | Intelligence for Imagination for more information about Fractal.

Role Overview

As an AI Engineer at Fractal, a company committed to integrating artificial intelligence into decision-making across Fortune 500 companies, you will develop and deploy AI models using advanced Python programming. You will design and maintain data pipelines for a range of processes including ETL, model scoring, and performance monitoring, and you will work with cross-functional teams to ensure AI solutions are effectively integrated into existing infrastructure. The role requires a deep understanding of SQL, workflow orchestration tools such as Airflow, and data storage technologies such as Snowflake and Hadoop, positioning you at the forefront of AI innovation and application in a dynamic corporate environment.

Responsibilities
  • Develop and deploy AI models leveraging a strong background in Python programming.
  • Design, implement, and maintain data pipelines for various purposes, including ETL processes, model scoring, model performance monitoring, data offloading, job scheduling, and automation.
  • Utilize SQL for data manipulation, querying, and analysis to support AI model development and optimization.
  • Collaborate with cross-functional teams to integrate AI solutions into existing infrastructure and workflows.
  • Become familiar with and use tools such as Airflow, Domino, and Control-M for workflow orchestration, scheduling, and automation (a minimal pipeline sketch follows the lists below).
  • Manage the codebase using version control systems such as GitLab (or Git) and participate in CI/CD pipeline activities.
  • Work with data storage technologies such as Snowflake, Cloudera, Hadoop, HDFS, and Hive for data storage, retrieval, and processing.

Skills Needed
  • Strong Python background with demonstrable experience in AI and machine learning.
  • Good knowledge of SQL for data manipulation, querying, and analysis.
  • Experience with designing and implementing data pipelines for various purposes, including ETL, model scoring, and automation.
  • Familiarity with Airflow, Domino, Control-M, Snowflake, GitLab (or Git in general), and CI/CD pipelines.
  • Understanding of Cloudera, Hadoop, HDFS, and Hive for big data processing and storage.
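
To make the pipeline responsibilities concrete, here is a minimal Airflow DAG sketch in Python. It is illustrative only: the DAG name, schedule, and task logic are assumptions for this example rather than Fractal's actual pipelines, and it assumes a recent Airflow installation.

  # Hypothetical sketch of an ETL -> model scoring -> monitoring pipeline in Airflow.
  # All names and helper logic are illustrative assumptions.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract_and_load():
      # Placeholder: pull source data and load it into a staging table (e.g., Snowflake).
      ...

  def score_model():
      # Placeholder: run the trained model against the freshly loaded data.
      ...

  def monitor_performance():
      # Placeholder: compute drift and accuracy metrics and push them to a monitoring store.
      ...

  with DAG(
      dag_id="model_scoring_pipeline",  # hypothetical name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",                # Airflow 2.4+; use schedule_interval on older versions
      catchup=False,
  ) as dag:
      etl = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
      score = PythonOperator(task_id="score_model", python_callable=score_model)
      monitor = PythonOperator(task_id="monitor_performance", python_callable=monitor_performance)

      etl >> score >> monitor

A setup along these lines keeps ETL, scoring, and performance monitoring as separate, retryable tasks, matching the responsibilities listed above.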

Required Qualifications
  • Strong Python Skills: Demonstrable experience in developing and deploying AI models using Python.
  • Proficiency in SQL: Extensive knowledge of SQL for complex data manipulation, querying, and analysis, essential for AI model development and optimization (see the query sketch after this list).
  • Data Pipeline Design and Implementation: Experience in designing and maintaining robust data pipelines for ETL processes, model scoring, model performance monitoring, and data offloading.
  • Workflow Orchestration Tools: Proficiency with tools such as Airflow for workflow scheduling, and with Domino and Control-M for task automation and management.
  • Version Control and CI/CD: Skilled in using version control systems like GitLab or Git, with an understanding of Continuous Integration and Continuous Deployment pipeline activities.
  • Data Storage Technologies: Knowledge of data storage and processing technologies such as Snowflake, Cloudera, Hadoop, HDFS, and Hive.
  • Cross-Functional Collaboration: Ability to work effectively with cross-functional teams to integrate AI solutions into existing company infrastructure and workflows.
  • Problem Solving: Strong analytical and problem-solving skills with a capacity to handle complex challenges in data and AI model deployment environments.
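
As a rough illustration of the SQL and Snowflake qualifications above, the following Python sketch queries a Snowflake table using the snowflake-connector-python package. The connection parameters, table, and column names are hypothetical placeholders.

  # Hypothetical sketch: pull recent feature rows from Snowflake for model scoring.
  # Account, credentials, table, and column names are illustrative assumptions.
  import snowflake.connector

  conn = snowflake.connector.connect(
      account="my_account",        # placeholder
      user="my_user",              # placeholder
      password="my_password",      # placeholder; a secrets manager would normally supply this
      warehouse="ANALYTICS_WH",
      database="ML_DB",
      schema="FEATURES",
  )
  try:
      cur = conn.cursor()
      # Aggregate the last 30 days of transactions per customer (illustrative query).
      cur.execute(
          """
          SELECT customer_id,
                 AVG(txn_amount) AS avg_txn_amount,
                 COUNT(*)        AS txn_count
          FROM transactions
          WHERE txn_date >= DATEADD('day', -30, CURRENT_DATE)
          GROUP BY customer_id
          """
      )
      rows = cur.fetchall()
      print(f"Fetched {len(rows)} feature rows for scoring")
  finally:
      conn.close()

In practice, a query like this would feed directly into the scoring step of the pipeline sketched earlier.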

Good to Have
  • Experience with StorageGrid, DBT, and Sagemaker is a plus.
  • Familiarity with Snowpark, APIs, Talon Batch, Kafka, and graph-based models.
  • Knowledge of building databases on Snowflake would be advantageous.

If you possess the required skills and are passionate about AI engineering, we encourage you to apply and join our team of talented professionals driving innovation in AI.
