ABOUT LIQUID AI:
Spun out of MIT CSAIL, we build general-purpose AI systems that run efficiently across deployment targets, from data center accelerators to on-device hardware, ensuring low latency, minimal memory usage, privacy, and reliability. We partner with enterprises across consumer electronics, automotive, life sciences, and financial services. We are scaling rapidly and need exceptional people to help us get there.
THE OPPORTUNITY:
This is a rare chance to sit at the intersection of frontier foundation models and real-world deployment. You’ll own applied post-training work end-to-end for some of the world’s largest enterprises, while still contributing directly to Liquid’s core model development.
Unlike most roles that force a trade-off between customer impact and foundational work, this role gives you both: deep ownership over how models are adapted, evaluated, and shipped, and a direct line into the evolution of Liquid’s post-training stack.
If you care about data quality, evaluation, and making models actually work in production, this is a chance to shape how applied AI is done at a foundation-model company.
WHAT WE'RE LOOKING FOR:
We need someone who:
- Takes ownership: Owns post-training projects end-to-end, from customer requirements through delivery and evaluation.
- Thinks end-to-end: Can reason across data generation, training, alignment, and evaluation as a single system.
- Is pragmatic: Optimizes for model quality and customer outcomes over publications or theory.
- Communicates clearly: Can translate between customer needs and internal technical teams, and push back when needed.
THE WORK:
- Act as the technical owner for enterprise customer post-training engagements.
- Translate customer requirements into concrete post-training specifications and workflows.
- Design and execute data generation, filtering, and quality assessment processes.
- Run supervised fine-tuning, preference alignment, and reinforcement learning workflows.
- Design task-specific evaluations, interpret results, and feed learnings back into core post-training pipelines.
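To make the supervised fine-tuning item above concrete, here is a minimal sketch of one core data-quality detail in SFT pipelines: masking prompt tokens so the loss is computed only on the response. The toy token IDs and random logits are placeholders standing in for a real tokenizer and model, not Liquid's actual stack.

```python
# Sketch of SFT label masking: supervise only the response span of each example.
# Toy token IDs and random logits stand in for a real LLM (assumptions).
import torch
import torch.nn.functional as F

IGNORE_INDEX = -100  # index skipped by cross_entropy; the usual SFT convention

def build_labels(input_ids: torch.Tensor, prompt_len: int) -> torch.Tensor:
    """Copy input_ids, then mask the prompt span so only the response is supervised."""
    labels = input_ids.clone()
    labels[:prompt_len] = IGNORE_INDEX
    return labels

# Toy sequence: 3 prompt tokens followed by a 2-token response.
input_ids = torch.tensor([5, 8, 2, 7, 1])
labels = build_labels(input_ids, prompt_len=3)

vocab_size = 10
torch.manual_seed(0)
logits = torch.randn(len(input_ids), vocab_size)  # stand-in for model outputs

# Shift so position t predicts token t+1, as in causal language modeling;
# masked prompt positions contribute nothing to the loss.
loss = F.cross_entropy(logits[:-1], labels[1:], ignore_index=IGNORE_INDEX)
print(labels.tolist())  # prompt positions masked to -100
```

Getting this masking (and the off-by-one shift) right is exactly the kind of detail that separates a clean post-training run from a silently degraded one.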
DESIRED EXPERIENCE:
Must-have:
- Hands-on experience with data generation and evaluation for LLM post-training.
- Experience training or fine-tuning models using SFT, preference alignment, and/or RL.
- Strong intuition for data quality and evaluation design.
Nice-to-have:
- Experience contributing to shared or general-purpose post-training infrastructure.
- Prior exposure to customer-facing or applied ML delivery environments.
- Familiarity with alignment or RL techniques beyond basic supervised fine-tuning.
WHAT SUCCESS LOOKS LIKE (YEAR ONE):
- Independently owns and delivers enterprise post-training projects with minimal oversight.
- Is trusted by customers as the technical owner, demonstrating strong judgment and delivery quality.
- Has made durable contributions to Liquid's general-purpose post-training pipelines by feeding applied learnings back into baseline model development.
WHAT WE OFFER:
- Real ML work: You will fine-tune models, generate data, and ship solutions, not configure API calls. Your work feeds directly back into our core model development.
- Compensation: Competitive base salary with equity in a unicorn-stage company.
- Health: We pay 100% of medical, dental, and vision premiums for employees and dependents.
- Financial: 401(k) matching up to 4% of base pay.
- Time Off: Unlimited PTO plus company-wide Refill Days throughout the year.