Summary
Join ProPublica, an independent, nonprofit newsroom, as a computational journalist to uncover abuses of power and betrayals of the public trust. You will leverage technology and data to identify and unlock impactful stories, collaborating with a talented team. This role requires journalism experience and coding skills, focusing on accountability-driven investigations. You will conduct data analysis, build data pipelines, and develop high-impact investigative projects. The position is full-time with benefits and offers remote work options or the choice of working from various ProPublica offices.
Requirements
- At least four years of experience working on data projects, ideally focused on accountability, in a newsroom
- Ability to bring an accountability lens to the issues of the day through quick investigations reported over a couple of weeks as well as longer projects that may require a few months (or more) of digging, using data and code to get there
- Experience building data pipelines and conducting rigorous analysis
- A solid understanding of the tools of data analysis, such as database management systems, statistics software like R, Google BigQuery, Python/pandas, Jupyter Notebooks, etc.
- Experience with best practices in data journalism, including a keen and careful eye for detail, documentation and reproducibility
- Journalism experience and the ability to code
- A talent for translating complex topics clearly and compellingly to our audience
- The self-discipline to work independently, as well as an eagerness to work with teammates and local partner reporters on collaborative projects
- Ability to travel, as necessary, for assignments, team summits and training
- A deep desire to work on important accountability stories and an eagerness to help dig for the truth to help spur reform
Responsibilities
- Find, or build, the data that uncovers stories with an accountability lens
- Conduct analyses that detect bias, influence and other harms
- Explain our work and make it reproducible by writing engaging methodologies
- Develop high-impact investigative projects using code and data analysis as well as interviews, research and on-the-ground reporting
- Clean, bulletproof and spot-check data that underpins our investigations
- Scrape websites and wrangle data, including the unstructured, messy kind
- Consume APIs, including poorly designed ones
Preferred Qualifications
- Experience working with AI and machine learning, including generative AI and large language models
- Familiarity with public data sources across several beats and experience requesting data through public records requests
- Comfort spinning many plates. You'll need to stay organized, focused and proactive
Benefits
- Full time with benefits
- Remote applicants anywhere in the U.S. are welcome
- Offices in New York City; Washington, D.C.; Atlanta; Chicago; Phoenix; and Berkeley, California