Full Stack Data Collection Lead


Two Six Technologies

💵 $120k-$195k
📍 Remote - United States

Summary

Join Two Six Technologies and apply your full-stack engineering skills to design and implement innovative data collection solutions. You will lead a team in designing, developing, and supporting scalable data collection systems built on modern web scraping technologies. As a lead engineer, you will collaborate with architects, data scientists, and DevOps engineers to deliver high-quality, robust code. The ideal candidate brings extensive experience with Python, AWS, and containerization technologies, along with a deep understanding of the technologies that power the web. You will oversee the data collection codebase, set coding standards, and mentor junior developers. The role requires a passion for solving complex problems and a commitment to continuous learning and improvement.

Requirements

  • Experience with the design, development, testing, and support of scalable, data-driven applications written in Python and deployed to production SaaS environments
  • Experience with web scraping and other techniques for collecting data from the internet (a simple collection sketch follows this list)
  • Deep understanding of HTTP, HTML, JavaScript, CSS, and other technologies that power websites
  • Experience with AWS or similar cloud-based infrastructure
  • Experience as a technical lead or key resource for an application, capability, or code base
  • Experience working independently and as a part of an Agile team
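
As a purely illustrative sketch of the kind of HTTP-based collection work described above, assuming the Python requests library, a minimal collector might look like the following; the endpoint (httpbin.org, a public test service) and the User-Agent string are placeholders, not anything specific to Two Six:

```python
# Illustrative only: a minimal HTTP collector of the sort described above.
# The endpoint is a public test service, not an actual data source for this role.
import requests


def fetch_json(url: str) -> dict:
    """Fetch a JSON payload over HTTP, failing loudly on non-2xx responses."""
    response = requests.get(
        url,
        timeout=30,
        headers={"User-Agent": "example-collector/1.0"},  # identify the client to the server
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(fetch_json("https://httpbin.org/json"))
```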

Responsibilities

  • Leading the development of high-volume data collection solutions that enable analysis by both human experts and machine learning systems
  • Rapidly developing software as an individual contributor at a high level and serving as an exemplar of quality, scalable, robust code
  • Creating proofs-of-concept and prototypes to quickly test ideas, as well as designing and building scalable, production-ready solutions
  • Collaborating with architects and Product leads on the roadmap for data collection capabilities
  • Staying current with the latest technological developments in web scraping and the challenges of maintaining collection as data sources quickly evolve
  • Operating in a collaborative environment, with a focus on taking action and working closely with peers to enable team success
  • Overseeing the data collection codebase, setting and maintaining coding standards, patterns, and best practices
  • Supporting the data collection capability by researching and responding quickly to issues and adapting collection to meet new challenges
  • Considering security best practices at every phase of software development
  • Continuously learning and improving
  • Working within Agile software development methodologies and being an active part of our continuous improvement efforts
  • Working in a fully remote team with a diverse set of skills and experiences
  • Independently identifying and solving problems, and questioning assumptions in pursuit of the right solutions
  • Actively participating in the peer code review process, both providing feedback and being open and receptive to feedback from others
  • Creating and executing both manual and automated tests

Preferred Qualifications

  • Experience with Scrapy or other web scraping technologies is a huge plus (see the sketch after this list)
  • Experience with Docker and Kubernetes
  • Experience with Object Oriented Programming (OOP)
  • Experience with database technologies, ideally Elastic and Postgres
  • AWS certification
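
For a concrete sense of what Scrapy-based collection can look like, here is a minimal sketch; the spider name, target site (quotes.toscrape.com, Scrapy's public practice site), and CSS selectors are illustrative assumptions, not a description of Two Six's actual pipelines:

```python
# Illustrative only: a minimal Scrapy spider of the kind this role would build and oversee.
import scrapy


class ExampleSpider(scrapy.Spider):
    name = "example"
    start_urls = ["https://quotes.toscrape.com/"]  # public scraping practice site

    def parse(self, response):
        # Extract each quote block with CSS selectors.
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
        # Follow pagination links until the site runs out of pages.
        next_page = response.css("li.next a::attr(href)").get()
        if next_page is not None:
            yield response.follow(next_page, callback=self.parse)
```

A spider like this can be run with `scrapy runspider example_spider.py -o quotes.json`; a production deployment would typically add item pipelines, retry and politeness settings, and monitoring.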

Benefits

  • Medical, dental, and vision insurance
  • Life and disability insurance
  • Retirement benefits
  • Paid leave
  • Tuition assistance and professional development

