
Senior Software Developer

theScore
Summary
Join PENN Entertainment's Sports Modeling Automation Team as a Backend Engineer and contribute to the development of cutting-edge online gaming and sports media products. You will design, implement, and maintain backend services and APIs using Python, build and manage complex data workflows with Argo Workflows, and develop event-driven distributed systems. You will also work in containerized environments using Docker and Kubernetes, build internal tools and libraries, and collaborate with data science and data engineering teams. This role requires strong computer science fundamentals, experience with modern web frameworks and API development, proficiency in Python, and hands-on experience with workflow orchestration tools like Argo Workflows. You will also need database proficiency, familiarity with containerization, and excellent problem-solving and communication skills.
Requirements
- Strong Computer Science Foundation: Solid understanding of data structures, distributed systems, and software design
- Passionate About Clean Code: Commitment to clean architecture and software craftsmanship
- Versatile Developer: Experience with modern web frameworks and API development
- Adaptable Learner: Proficiency in Python with a willingness to learn new technologies and frameworks
- Workflow Orchestration: Hands-on experience with workflow orchestration tools such as Argo Workflows (or Airflow)
- Database Proficiency: Strong experience with relational databases such as PostgreSQL and MySQL, and NoSQL databases such as Bigtable, MongoDB, and DynamoDB
- Comfortable with Command Line: Proficient in terminal operations
- Familiar with Containerization: Knowledge of Kubernetes and container orchestration
- Caching Knowledge: Understanding of caching strategies and tools
- Problem-Solving Skills: Excellent analytical abilities and independent troubleshooting
- Strong Communicator: Ability to convey complex technical concepts to both technical and non-technical stakeholders
Responsibilities
- Design, implement, and maintain backend services and APIs using Python (primarily FastAPI or Flask)
- Build and manage complex data workflows with Argo Workflows (Kubernetes-native workflow engine supporting DAG and step-based workflows)
- Develop event-driven distributed systems that process large amounts of data and integrate with downstream backend services (Kafka experience preferred, or any other event streaming/message queue platform)
- Work in containerized environments using Docker and Kubernetes
- Build internal tools and libraries to help accelerate other backend teams
- Work with data science and data engineering teams to build best-in-class SDLC processes
- Oversee the design and maintenance of data systems and contribute to the continual enhancement of the data platform
- Collaborate with the team to define, track, and meet SLOs
- Maintain and expand existing systems, tooling, and infrastructure
- Ensure System Reliability: Implement robust monitoring and alerting mechanisms using tools like DataDog
- Participate in Agile Processes: Engage in the design, architecture, and delivery of new features within a collaborative agile/scrum environment
- Deploy to Cloud Infrastructure: Manage deployments of services and applications to our cloud platforms
- Strategic Partnership: Work closely with the tech lead and engineering manager to help set the team's direction
- Demonstrate Technical Proficiency: Showcase expertise in the team's tech stack, tooling, and architecture to lead wide-ranging projects effectively
- On-Call Rotation: Participate in our on-call rotation to address critical issues during off-business hours
Preferred Qualifications
- Knowledge of other programming languages (e.g., Elixir, Java, Go)
Benefits
- Competitive compensation package
- Comprehensive benefits package
- Fun, relaxed work environment
- Education and conference reimbursements