The goal of our group is to work on whichever technical alignment projects have the highest expected value. Our specific focus is researching and building robust lie detectors for LLMs. More about our research can be found here.