Tredence focuses on last-mile delivery, turning insights into actions by uniting its strengths in business analytics, data science, and software engineering. The largest companies across industries engage Tredence to deploy its prediction and optimization solutions at scale, empowering end users to improve decision making. Headquartered in the San Francisco Bay Area, the company serves clients in the US, Canada, Europe, and Southeast Asia. Learn more at www.tredence.com or follow us.
Primary Skills
Databricks
PySpark
ETL
DWH
Responsibilities
Manage end-to-end delivery by investigating problem areas and working cross-functionally with product managers and other stakeholders
Follow the Agile development methodology; think strategically and execute methodically
Develop and manage capacity and growth forecasts for the environment within budget
Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements
Drive optimization, testing, and tooling to improve the quality of solutions
Manage teams that build and operate high volume distributed systems in a SaaS environment
Devise efficient processes that increase velocity and quality
Desired Experience/Skills:
6+ years of relevant experience
At least two years of experience building and leading highly complex, technical engineering teams.
Strong hands-on experience in Databricks
Experience managing distributed teams preferred.
Strong technical experience in large distributed systems, data warehousing, and data lakes at scale
Project management skills: financial/budget management, scheduling, and resource management experience with medium- and large-scale projects
Comfortable working with ambiguity and multiple stakeholders.
Comfortable working cross-functionally with product management and directly with customers; ability to deeply understand product and customer personas.
Architecture design experience for cloud and non-cloud platforms
Expertise in the Azure cloud platform
Knowledge of orchestrating workloads in the cloud
Ability to set and lead the technical vision while balancing business drivers
Strong experience with PySpark and Python programming
Proficiency with APIs, containerization, and orchestration is a plus
Other Skills/Qualifications:
Bachelor's and/or master's degree in computer science or equivalent experience.
Strong communication, analytical and problem-solving skills with a high attention to detail.