Analytics Engineer

LightFeather

πŸ“Remote - Worldwide

Job highlights

Summary

Join LightFeather as an Analytics Engineer and leverage your Airflow expertise to build and maintain robust data pipelines. You will collaborate with various teams to translate business needs into technical solutions, ensuring data quality and scalability. This full-time, remote position requires a minimum of 5 years of experience in a similar role and expertise in Apache Airflow. You will be responsible for optimizing data workflows, troubleshooting issues, and establishing best practices. LightFeather offers a chance to work on impactful projects within a diverse and inclusive environment.

Requirements

  • US Citizenship
  • Active clearance at the Public Trust level or above (IRS clearance preferred)
  • Bachelor’s degree in Computer Science, Data Engineering, or a related field
  • Minimum of 5 years of experience as an Analytics Engineer, Data Engineer, or in a similar role
  • Expertise in Apache Airflow, including creating and managing complex DAGs
  • Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, Snowflake)
  • Experience with data warehousing concepts and technologies
  • Proficiency in at least one programming language (e.g., Python, Java, Scala)
  • Familiarity with cloud platforms (AWS, GCP, Azure) and related data services
  • Excellent problem-solving skills and attention to detail
  • Strong communication skills and ability to work collaboratively in a team environment

Responsibilities

  • Architect, build, and maintain scalable and efficient data pipelines using Apache Airflow to orchestrate workflows and automate complex processes
  • Partner with cross-functional teams to gather and analyze business requirements, translating them into technical specifications and scalable solutions
  • Develop and implement rigorous testing, validation, and monitoring strategies to ensure data quality, accuracy, and integrity throughout the pipeline
  • Optimize data workflows for enhanced performance, scalability, and cost-effectiveness, leveraging Airflow and related technologies
  • Proactively monitor and troubleshoot Airflow DAGs, identifying and resolving issues to minimize downtime and ensure seamless data operations
  • Establish and enforce best practices for ETL/ELT development, workflow orchestration, and data pipeline architecture
  • Continuously research and integrate the latest advancements in Airflow, workflow management, and data engineering technologies to improve existing systems
  • Create and maintain comprehensive documentation for all data pipelines, workflows, and configurations to facilitate knowledge sharing and system maintenance

Preferred Qualifications

  • Experience with big data tools and frameworks (e.g., Spark, Kafka)
  • Knowledge of data visualization tools (e.g., Tableau, Power BI, Looker)
  • Familiarity with version control systems like Git
  • Experience with CI/CD pipelines for data workflows

Benefits

This is a full-time, remote position.


Please let LightFeather know you found this job on JobsCollider. Thanks! 🙏