GenAI Staff Machine Learning Engineer, AI Runtime
Databricks
P-984
Founded in late 2020 by a small group of machine learning engineers and researchers, MosaicML enables companies to train, fine-tune, and deploy custom AI models on their own data with maximum security and control. Compatible with all major cloud providers, the MosaicML platform provides maximum flexibility for AI development. Introduced in 2023, MosaicML's pretrained transformer models have set a new standard for open-source, commercially usable LLMs and have been downloaded over 3 million times. MosaicML is committed to the belief that a company's AI models are as valuable as any other core IP, and that high-quality AI models should be available to all.
MosaicML has been part of Databricks since July 2023. We are passionate about enabling our customers to solve the world's toughest problems, from making the next mode of transportation a reality to accelerating the development of medical breakthroughs. We do this by building and running the world's best data and AI platform so our customers can use deep data insights to improve their business. We leap at every opportunity to solve technical challenges, striving to empower our customers with the best data and AI capabilities.
Summary:
We are seeking top-tier Machine Learning Engineers to lead the charge in empowering businesses to deploy LLMs and advanced generative models in production. Your role will involve building and maintaining Mosaic's ML Runtime, which enables customers to harness their unique data to develop models with optimal quality, performance, and cost. You will contribute to MosaicML's open-source initiatives, sharing our innovations with the community, and tackle multifaceted challenges across the machine learning stack, from kernels and software to algorithms and new state-of-the-art models.
You will:
- Design and implement tooling and open-source technologies to enable the development of automated ML pipelines for data preprocessing, model training, hyperparameter tuning, and model evaluation for Databricks' customers.
- Design and implement robust, scalable ML infrastructure and model-serving components to support seamless integration of AI/ML models into customer production environments.
- Implement advanced optimization techniques to reduce the resource footprint of models while preserving their performance and balancing usability for our developers and customers.
- Collaborate with product managers and cross-functional teams to drive technology-first initiatives that enable novel business strategies and product roadmaps.
- Support our user community through documentation, talks, tutorials, and collaborations.
- Contribute to the broader AI community by publishing research, presenting at conferences, and actively participating in open-source projects, enhancing Databricks' reputation as an industry leader.
We look for:
- 4+ years of full-time industry experience
- Experience with deep learning frameworks (e.g., PyTorch, TensorFlow)
- Experience with GPUs (e.g., Nvidia, AMD) and alternative deep learning accelerators
- Strong sense of design and usability
- Effective communication skills and the ability to articulate complex technical ideas to cross-disciplinary internal and external stakeholders
- A history of contributing to or developing open-source projects is a bonus, but not a requirement
We value candidates who are curious about all parts of the company's success and are willing to learn new technologies along the way.
Pay Range Transparency
Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role are listed below and represent the base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks utilizes the full width of the range. The total compensation package for this position may also include eligibility for an annual performance bonus, equity, and the benefits listed above. For more information regarding which range your location falls in, visit our page here.
About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.
Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.