Senior Director of Data Engineering


phData

πŸ“Remote

Summary

Join phData, a leading innovator in the modern data stack, partnering with major cloud data platforms. We are a remote-first global company committed to helping enterprises overcome their toughest data challenges. As a Senior Director of Data Engineering in LATAM (Brazil), you will play a critical role in driving delivery excellence and ensuring customer success. You will manage multiple work streams, build executive relationships, and guide clients through complex data challenges while designing scalable solutions. This role requires technical leadership, client engagement, and account growth, positioning you as a strategic influencer in the data landscape. We prioritize hiring not just brilliant engineers, but technical consultants who excel at both engineering and client engagement.

Requirements

  • 10+ years as a hands-on Solutions Architect designing and implementing data solutions
  • 2+ years of consulting leadership experience working with external customers, with the ability to multitask, prioritize, shift focus frequently, and work across a variety of projects
  • Technical account leadership, including developing a technology strategy and roadmap with the customer, leading delivery teams, and collaborating closely with sales and other internal teams
  • Ability to take end-to-end technical solutions into production, ensuring performance, security, scalability, and robust data integration
  • Programming expertise in Java, Python, and/or Scala; SQL; and core cloud data platforms including Snowflake, AWS, Azure, Databricks, and GCP
  • Demonstrated expertise in effectively leading and managing a team comprising Solution Architects and Data Engineers, fostering internal growth through coaching, mentoring, and performance management
  • Proven track record of collaborating with client stakeholders, technology partners, and cross-functional sales and delivery team members across distributed global teams, ensuring seamless, successful project delivery outcomes
  • Ability to build strong cross-practice relationships to drive customer success
  • A strong sense of ownership in resolving challenges and a commitment to exceptional outcomes across all aspects of project execution
  • Client-facing written and verbal communication skills and experience
  • Detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, logical system views), including creating and delivering detailed client presentations

Responsibilities

  • Architect end-to-end data solutions, ensuring scalability, performance, and alignment with customer requirements and goals
  • Lead a team of data engineers to build data platforms, pipelines, and products that meet customer requirements and expectations
  • Communicate business value and ROI of data solutions to both technical and non-technical stakeholders
  • Mentor team members on best practices in data architecture, modeling, and technology. Stay abreast of emerging technologies and methodologies in data to inform solution design and delivery
  • Build relationships with client stakeholders to understand their long-term data engineering goals and design roadmaps that align with business objectives
  • Craft persuasive technical approaches and value propositions to drive client data initiatives forward
  • Identify and drive account expansion opportunities by uncovering unmet client needs and where data can add value
  • Lead responses to RFIs and RFPs, crafting persuasive technical approaches and value propositions

Preferred Qualifications

  • 4-year Bachelor's degree in Computer Science or a related field
  • Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
  • Cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
  • Data integration technologies: Spark, Kafka, event/streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies
  • Multiple data sources (e.g., queues, relational databases, files, search, APIs)
  • Complete software development life cycle experience, including design, documentation, implementation, testing, and deployment
  • Automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
  • Workflow management and orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi
  • Methodologies: Agile project management, data modeling (e.g., Kimball, Data Vault)

Benefits

  • Remote-First Work Environment
  • Casual, award-winning small-business work environment
  • Collaborative culture that prizes autonomy, creativity, and transparency
  • Competitive compensation, excellent benefits, 4 weeks of PTO plus 10 holidays (and other cool perks)
  • Accelerated learning and professional development through advanced training and certifications
