Founded in late 2020 by a small group of machine learning engineers and researchers, MosaicML enables companies to fine-tune, train, and deploy custom AI models on their own data, with maximum security and control. Compatible with all major cloud providers, the MosaicML platform gives teams full flexibility in how they develop AI. Introduced in 2023, MosaicML’s pretrained transformer models have established a new standard for open-source, commercially usable LLMs and have been downloaded over 3 million times. MosaicML is committed to the belief that a company’s AI models are as valuable as any other core IP, and that high-quality AI models should be available to all.
MosaicML has been part of Databricks since July 2023. We are passionate about enabling our customers to solve the world's toughest problems, from making the next mode of transportation a reality to accelerating the development of medical breakthroughs. We do this by building and running the world's best data and AI platform so our customers can use deep data insights to improve their business. We leap at every opportunity to solve technical challenges, striving to empower our customers with the best data and AI capabilities.
You will:
- Explore and analyze performance bottlenecks in ML training and inference
- Design, implement, and benchmark libraries and methods that overcome these bottlenecks
- Build tools for performance profiling, analysis, and estimation for ML training and inference
- Balance the tradeoff between performance and usability for our customers
- Support our community through documentation, talks, tutorials, and collaborations
- Collaborate with external researchers and leading AI companies on various efficiency methods
We look for:
- Hands-on experience with the internals of deep learning frameworks (e.g. PyTorch, TensorFlow) and deep learning models
- Experience with high-performance linear algebra libraries such as cuDNN, CUTLASS, Eigen, or MKL
- General experience with the training and deployment of ML models
- Experience with compiler technologies relevant to machine learning
- Experience with distributed systems development or distributed ML workloads
- Hands-on experience with writing CUDA code and knowledge of GPU internals (Preferred)
- Publications in top-tier ML or systems conferences such as MLSys, ICML, ICLR, KDD, NeurIPS (Preferred)
We value candidates who are curious about all parts of the company's success and are willing to learn new technologies along the way.
Databricks is committed to fair and equitable compensation practices. The pay range for this role is listed below and represents the base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. The total compensation package for this position may also include eligibility for an annual performance bonus, equity, and benefits.
Local Pay Range: $150,000–$190,000 USD
About Databricks:
Databricks is the data and AI company. More than 9,000 organizations worldwide — including Comcast, Condé Nast, and over 50% of the Fortune 500 — rely on the Databricks Lakehouse Platform to unify their data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe. Founded by the original creators of Apache Spark™, Delta Lake and MLflow, Databricks is on a mission to help data teams solve the world’s toughest problems. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Our Commitment to Diversity and Inclusion:
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.
Compliance:
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.